Mistral Small 4
Mistral AI · France
A unified model that folds Mistral's Instruct, reasoning (Magistral), and code (Devstral) lines into a single 119B MoE under Apache 2.0. 6.5B active parameters, 256K context, 24 languages, toggleable reasoning effort. The strongest permissively licensed EU option at this scale.
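A minimal serving sketch, assuming the weights land on Hugging Face under the hypothetical repo ID `mistralai/Mistral-Small-4` and that reasoning effort is toggled through the system prompt; neither assumption is confirmed by the listing here.

```python
# Hedged sketch: the repo ID and the reasoning-effort toggle below are
# assumptions, not documented facts about Mistral Small 4.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-Small-4"  # hypothetical repo ID

tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # bf16 serving; see Known risks for sizing
    device_map="auto",           # shards across available GPUs
)

messages = [
    # Assumption: reasoning effort is set via a system instruction.
    {"role": "system", "content": "Reasoning effort: low"},
    {"role": "user", "content": "Summarise the Apache 2.0 attribution rules."},
]
inputs = tok.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

out = model.generate(inputs, max_new_tokens=256)
print(tok.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True))
```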
Licence facts
- Licence: Apache 2.0
- Commercial use: Unrestricted
- Derivatives: Allowed
- Attribution: Required
- Parameters: 119B MoE (6.5B active; 128 experts, 4 active per token)
- Context: 256K
- Multimodal: Text + images (input)
- Training data: Not disclosed; 24 languages
- Last updated: 2026-Q1
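The figures above are enough to back out the shared-versus-expert parameter split, under the assumption (not stated in the listing) that all non-shared weights live in the experts and routing activates exactly 4 of the 128 experts per token:

```python
# Back out shared vs. per-expert parameters from the listed figures.
# Assumes all non-shared weights sit in the 128 experts and exactly
# 4 experts fire per token (standard top-k MoE routing).
TOTAL_PARAMS = 119e9   # 119B total
ACTIVE_PARAMS = 6.5e9  # 6.5B active per token
N_EXPERTS = 128
K_ACTIVE = 4

# total  = shared + N_EXPERTS * per_expert
# active = shared + K_ACTIVE  * per_expert
per_expert = (TOTAL_PARAMS - ACTIVE_PARAMS) / (N_EXPERTS - K_ACTIVE)
shared = ACTIVE_PARAMS - K_ACTIVE * per_expert

print(f"per-expert params: {per_expert / 1e9:.2f}B")  # ~0.91B
print(f"shared params:     {shared / 1e9:.2f}B")      # ~2.87B
```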
Known risks
- Training data undisclosed
- Multi-GPU required for bf16 serving at full 256K context (see the sizing sketch below)
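To make the multi-GPU risk concrete, a back-of-envelope estimate assuming 2 bytes per parameter in bf16 and 80 GB-class accelerators; the KV cache at 256K context comes on top and cannot be sized without the undisclosed layer and head dimensions:

```python
import math

# Weights-only footprint in bf16 (2 bytes per parameter).
# KV cache at 256K context adds more, but sizing it needs layer and
# head dimensions that are not disclosed for this model.
TOTAL_PARAMS = 119e9
BYTES_PER_PARAM = 2   # bf16
GPU_MEMORY_GB = 80    # assumption: 80 GB-class accelerator

weights_gb = TOTAL_PARAMS * BYTES_PER_PARAM / 1e9
gpus_for_weights = math.ceil(weights_gb / GPU_MEMORY_GB)

print(f"bf16 weights: {weights_gb:.0f} GB")        # ~238 GB
print(f"GPUs (weights only): {gpus_for_weights}")  # 3 x 80 GB minimum
```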
Reviewed by Ali Madjaji · Last reviewed 2026-04-15