OLMo 2 32B

AllenAI · United States
Fully open model: weights, training data (Dolma 2), training code, intermediate checkpoints, and training logs are all published, all under Apache 2.0. The strongest choice when EU AI Act transparency obligations matter.
License characteristics
License: Apache 2.0
Commercial use: Unrestricted
Derivatives: Allowed
Attribution: Required
Parameters: 7B / 13B / 32B (dense)
Training data: Dolma 2 (fully published)
Reproducibility: Full (code + data + checkpoints)
Last updated: 2025-Q1
Known risks
  • Capability gap versus frontier models at the largest sizes
  • US-origin weights (supply-chain consideration)
Reviewed by Ali Madjaji · Last reviewed 2026-04-15