SmolLM3 3B

Hugging Face · USA
Fully open small model: Apache 2.0 weights, published training data, and a public engineering blueprint. Six native languages (EN/FR/ES/DE/IT/PT) cover the major EU markets. 128K context via YaRN. A strong default for edge or on-prem EU deployments where transparency matters.
Licence facts
Licence: Apache 2.0
Commercial use: Unrestricted
Derivatives: Allowed
Attribution: Required
Parameters: 3B dense
Context: 64K trained, 128K via YaRN
Languages: EN / FR / ES / DE / IT / PT (+ secondary AR / ZH / RU)
Training data: 11.2T tokens (FineWeb-Edu, DCLM, FineWeb2-HQ; published)
Reproducibility: Full training blueprint published
Last updated: 2025-Q3
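The 64K-to-128K extension works by YaRN rope scaling, which stretches the rotary position embeddings by a fixed factor at inference time. A minimal sketch of the scaling arithmetic and the corresponding `rope_scaling` config, assuming the Hugging Face `transformers` convention for the config fields and the Hub model id `HuggingFaceTB/SmolLM3-3B` (verify both against the model card before use):

```python
# Sketch: extending SmolLM3's trained 64K window to 128K via YaRN.
# Field names follow the transformers rope_scaling convention (assumption).

trained_ctx = 65536   # 64K positions seen during training
target_ctx = 131072   # 128K desired at inference

# YaRN interpolates rotary-embedding frequencies by this stretch factor.
yarn_factor = target_ctx / trained_ctx  # 131072 / 65536 = 2.0

rope_scaling = {
    "rope_type": "yarn",
    "factor": yarn_factor,
    "original_max_position_embeddings": trained_ctx,
}

# Typical usage (requires transformers and the weights; shown commented out):
# from transformers import AutoModelForCausalLM
# model = AutoModelForCausalLM.from_pretrained(
#     "HuggingFaceTB/SmolLM3-3B",
#     rope_scaling=rope_scaling,
#     max_position_embeddings=target_ctx,
# )
```

A factor of 2.0 means each rotary frequency is interpolated to cover twice the trained range, which is why quality at 128K can lag slightly behind the natively trained 64K window.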
Known risks
  • Small capacity: 3B won't match frontier quality on complex reasoning
  • HF org is US-incorporated despite French heritage — see Hugging Face provider entry
Reviewed by Ali Madjaji · Last reviewed 2026-04-16