OLMo 2 32B
Allen Institute for AI (Ai2) · USA
Fully open model: weights, training data (Dolma 2), training code, checkpoints, and logs are all published. Apache 2.0 across the board. The strongest choice when EU AI Act transparency obligations matter.
Licence facts
- Licence: Apache 2.0
- Commercial use: Unrestricted
- Derivatives: Allowed
- Attribution: Required
- Parameters: 7B / 13B / 32B (dense)
- Training data: Dolma 2 (fully published)
- Reproducibility: Full (code + data + checkpoints)
- Last updated: 2025-Q1
Known risks
- Capability gap vs frontier models at large sizes
- US-origin weights (supply-chain jurisdiction exposure)
Reviewed by Ali Madjaji · Last reviewed 2026-04-15