Description
The latest model in the Phi series, suited to chat use with a context window of up to 16K tokens.
Stats
24.5K Downloads
8 stars
README
Supports a context length of 16K tokens.
Trained on 9.8 trillion tokens.
Trained on a mix of synthetic data, filtered public-domain websites, academic books, and Q&A datasets.
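The 16K context window above bounds prompt plus completion together. A minimal sketch of budgeting tokens against that limit (assuming "16K" means 16,384 tokens; the exact figure and any tokenizer details are not stated in this card):

```python
CONTEXT_LENGTH = 16_384  # assumed value of the card's "16K tokens" context window


def remaining_budget(prompt_tokens: int, context_length: int = CONTEXT_LENGTH) -> int:
    """Return how many tokens are left for the completion after the prompt.

    The prompt and the generated completion share one context window,
    so a long prompt directly shrinks the maximum completion length.
    """
    if prompt_tokens > context_length:
        raise ValueError("prompt already exceeds the context window")
    return context_length - prompt_tokens


# A 12,000-token prompt leaves 4,384 tokens for generation.
print(remaining_budget(12_000))
```

In practice a runtime's `max_tokens`-style setting for the completion should be capped at this remainder, or the request will be truncated or rejected.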
Sources
The underlying model files that this model uses
Based on
GGUF