Mistral Small 4

Mistral AI · France
A unified model folding Instruct, reasoning (Magistral), and code (Devstral) into a single 119B MoE released under Apache 2.0. 6.5B active parameters, 256K context, 24 languages, and toggleable reasoning effort. The strongest permissively licensed EU option at this scale.
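The "128 experts, 4 active" configuration means each token is routed to only its top-4 experts, which is why the active parameter count (6.5B) is far below the total (119B). A minimal sketch of top-k expert routing, with softmax-renormalized weights over the selected experts (illustrative only; Mistral's actual router internals are not published):

```python
import numpy as np

def route_tokens(logits: np.ndarray, top_k: int = 4):
    """Top-k MoE routing: send each token to its top_k highest-scoring
    experts, weighting them by a softmax over the selected logits only."""
    # Indices of the top_k experts per token (order among them is irrelevant)
    top_idx = np.argpartition(logits, -top_k, axis=-1)[:, -top_k:]
    top_logits = np.take_along_axis(logits, top_idx, axis=-1)
    # Softmax restricted to the chosen experts, stabilized by max-subtraction
    exp = np.exp(top_logits - top_logits.max(axis=-1, keepdims=True))
    weights = exp / exp.sum(axis=-1, keepdims=True)
    return top_idx, weights

rng = np.random.default_rng(0)
logits = rng.normal(size=(3, 128))       # 3 tokens, 128 experts
idx, w = route_tokens(logits, top_k=4)
print(idx.shape, w.shape)                # (3, 4) (3, 4)
print(np.allclose(w.sum(axis=-1), 1.0))  # True: weights renormalize per token
```

Only the 4 selected experts' FFN weights are exercised per token, so compute scales with the active parameters while memory scales with the total.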
License characteristics
Licence
Apache 2.0
Commercial use
Unrestricted
Derivatives
Allowed
Attribution
Required
Parameters
119B MoE (6.5B active, 128 experts, 4 active per token)
Context
256K
Multimodal
Text + images (input)
Training data
Not disclosed; 24 languages
Last updated
2026-Q1
Known risks
  • Training data undisclosed
  • Multi-GPU required for bf16 serving at full context
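The multi-GPU note follows from a back-of-envelope memory estimate: all 119B parameters must be resident even though only 6.5B are active per token, and the KV cache grows with context length. A hedged sketch (the layer and head counts below are hypothetical, since the architecture details are not published):

```python
GB = 1e9

# Weights: every expert is resident in memory regardless of routing.
total_params = 119e9
weights_gb = total_params * 2 / GB   # bf16 = 2 bytes/param

# KV cache per token = 2 (K and V) * layers * kv_heads * head_dim * 2 bytes.
# These values are ASSUMED for illustration; the card does not disclose them.
layers, kv_heads, head_dim = 56, 8, 128
kv_per_token = 2 * layers * kv_heads * head_dim * 2
kv_cache_gb = kv_per_token * 256 * 1024 / GB   # full 256K context

print(f"weights  ~ {weights_gb:.0f} GB")   # ~238 GB
print(f"KV cache ~ {kv_cache_gb:.0f} GB")  # ~60 GB under these assumptions
```

Weights alone (~238 GB in bf16) exceed any single current accelerator (80–192 GB), so full-context bf16 serving needs tensor or expert parallelism across multiple GPUs.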
Reviewed by Ali Madjaji · Last reviewed 2026-04-15