phi-4

Public

The latest model in the Phi series, suited to chat with a context window of up to 16K tokens

17.8K Downloads

4 stars

Capabilities

Minimum system memory

8GB

Tags

14B
phi

Last updated

Updated on May 24 by lmmy

README

Phi-4 by Microsoft

Supports a context length of 16K tokens.

Trained on 9.8 trillion tokens.

Trained on a mix of synthetic, filtered public domain websites, academic books, and Q&A datasets.
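A quick way to work with the 16K-token context limit is to estimate whether a prompt will fit before sending it. The sketch below uses a rough ~4 characters-per-token rule of thumb, which is an assumption for illustration, not a property of phi-4's actual tokenizer; the `CONTEXT_LIMIT` and `reserve_for_reply` values are likewise illustrative.

```python
# Rough check of whether a prompt fits phi-4's 16K-token context window.
# The ~4 characters-per-token ratio is a common rule-of-thumb assumption,
# not derived from phi-4's real tokenizer.
CONTEXT_LIMIT = 16_000
CHARS_PER_TOKEN = 4  # heuristic estimate


def estimated_tokens(text: str) -> int:
    """Estimate the token count of `text` from its character length."""
    return max(1, len(text) // CHARS_PER_TOKEN)


def fits_in_context(text: str, reserve_for_reply: int = 1_000) -> bool:
    """Return True if the prompt likely fits, leaving room for the model's reply."""
    return estimated_tokens(text) + reserve_for_reply <= CONTEXT_LIMIT


print(fits_in_context("Hello, phi-4!"))  # a short prompt easily fits
```

For real applications, count tokens with the model's own tokenizer instead of a character heuristic, since ratios vary by language and content.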

Sources

The underlying model files this model uses

Based on