Phi-4

Microsoft · United States
An MIT-licensed 14B dense model from Microsoft Research. Heavy use of synthetic training data is disclosed; the model is English-primary with thin multilingual coverage. Strongest small-model option for permissive-licence EU deployments.
Licence characteristics
Licence
MIT
Commercial use
Unrestricted
Derivatives
Allowed
Attribution
Minimal
Parameters
14B dense
Training data
9.8T tokens: synthetic + filtered web + books + Q&A (partial disclosure)
Context
16K
Last updated
2024-Q4
Known risks
  • English-primary, ~8% multilingual training data — weak on non-English EU workloads
  • Synthetic-data heavy — benchmark scores may overstate real-world generalization
Reviewed by Ali Madjaji · Last reviewed 2026-04-15