3.2K Downloads

microsoft/phi-4
14B · phi

The latest in the Phi model series: suitable for chat with a context length of up to 16K tokens.

Last Updated: 26 days ago
README

Phi-4 by Microsoft

Supports a context length of 16k tokens.
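Before sending a long prompt, it can help to sanity-check that it fits within the 16K-token window. A minimal sketch, assuming a crude ~4 characters-per-token heuristic (for accurate counts, use the model's actual tokenizer):

```python
CONTEXT_LIMIT = 16_384  # Phi-4's 16K-token context window

def estimate_tokens(text: str) -> int:
    """Very rough token estimate: ~4 characters per token for English text
    (a heuristic assumption, not the model's real tokenizer)."""
    return max(1, len(text) // 4)

def fits_in_context(prompt: str, reserved_for_reply: int = 1024) -> bool:
    """True if the prompt plus a reply budget fits in the 16K window."""
    return estimate_tokens(prompt) + reserved_for_reply <= CONTEXT_LIMIT

print(fits_in_context("Hello, Phi-4!"))  # a short prompt easily fits
```

The `reserved_for_reply` budget leaves headroom for the model's answer, since prompt and completion share the same context window.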

Trained on 9.8 trillion tokens.

Trained on a mix of synthetic data, filtered public-domain websites, academic books, and Q&A datasets.

sources

The underlying model files this model uses

Based on

When you download this model, LM Studio picks the source best suited to your machine (you can override this).
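Once downloaded, the model can typically be queried through LM Studio's local, OpenAI-compatible server. A minimal sketch of building a chat-completion request body; the endpoint URL and the model identifier "phi-4" are assumptions, so check what your local server actually reports:

```python
import json

# Assumed default LM Studio endpoint -- verify against your local setup.
ENDPOINT = "http://localhost:1234/v1/chat/completions"

payload = {
    "model": "phi-4",  # assumed identifier; match the name your server lists
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize the Phi model series in one sentence."},
    ],
    "max_tokens": 256,  # cap the reply length
}

# Serialize the request body; POST this to ENDPOINT to get a completion.
print(json.dumps(payload, indent=2))
```

The payload follows the standard OpenAI chat-completions shape, so any OpenAI-compatible client library should also work against the local server.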

config

Custom configuration options included with this model

No custom configuration.