Gemma 4 31B Instruct
Google DeepMind · United States
Based on published licence terms, Gemma 4 31B ships under pure Apache 2.0, a notable break from the Gemma Terms of Use attached to prior generations, with no prohibited-use carve-outs. US origin carries Schrems II and CLOUD Act exposure, and at 30.7B dense parameters the model may cross the EU AI Act's compute-based GPAI systemic-risk threshold, a classification enterprise deployers should verify and document.
Model facts
- Parameters: 30.7B dense + ~550M vision encoder
- Architecture: dense decoder with hybrid sliding-window + global attention, 60 layers (see the KV-cache sketch after this list)
- Modality: text + image in, text out (no audio)
- Context length: 256K tokens
- Data cutoff: January 2025
- Released: 2026-04-02
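A quick sense of why the hybrid attention layout matters at 256K context: global-attention layers must cache keys and values for the full window, while sliding-window layers cache only a short local window. The sketch below estimates per-sequence KV-cache memory under both layouts; only the 60 layers and 256K context come from the card above, while the KV-head count, head dimension, window size, layer split, and bf16 cache dtype are illustrative assumptions.

```python
# Hypothetical per-sequence KV-cache estimate for a 60-layer decoder with
# hybrid sliding-window + global attention. Only the layer count (60) and
# context length (256K) come from the card; KV heads, head dim, window
# size, layer split, and bf16 cache dtype are assumptions for illustration.

LAYERS = 60
CONTEXT = 256 * 1024      # 256K tokens (from the card)
KV_HEADS = 16             # assumption
HEAD_DIM = 128            # assumption
WINDOW = 4096             # assumed sliding-window size
BYTES_PER_VALUE = 2       # assumed bf16 cache

def kv_bytes(cached_tokens: int, layers: int) -> int:
    """KV-cache bytes for the given layers; the 2x covers keys and values."""
    return 2 * layers * cached_tokens * KV_HEADS * HEAD_DIM * BYTES_PER_VALUE

# Baseline: every layer attends (and caches) globally.
all_global = kv_bytes(CONTEXT, LAYERS)

# Hybrid: assume a 5:1 local-to-global split (50 sliding-window, 10 global).
hybrid = kv_bytes(WINDOW, 50) + kv_bytes(CONTEXT, 10)

GIB = 1024 ** 3
print(f"all-global layers: {all_global / GIB:.1f} GiB per 256K sequence")
print(f"hybrid layout:     {hybrid / GIB:.1f} GiB per 256K sequence")
```

Under these assumptions the hybrid layout cuts the cache from roughly 120 GiB to roughly 22 GiB per full-length sequence, which is the practical argument for the design.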
Known risks
- GPAI-with-systemic-risk threshold: the EU AI Act presumes systemic risk above ~10^25 training FLOP (Art. 51(2)), a compute bar rather than a parameter-count bar. At 30.7B dense parameters the model crosses it only if trained on roughly 54T+ tokens, plausible for this generation but undisclosed, which would trigger enhanced Art. 55 obligations (systemic-risk evaluation, incident reporting). Deployers should verify Google's GPAI provider notification and rely on it rather than classifying independently; see the compute sketch after this list.
- Training-data summary is domain-level only (web documents, code, maths, images, audio, 140+ languages, Jan 2025 cutoff) with no token count, dataset enumeration, or opt-out mechanism, leaving it thin under EU AI Act Art. 53(1)(d)'s 'sufficiently detailed summary' requirement.
- US controller — Google Cloud standard DPA and SCCs available for managed deployments; self-hosted weights avoid transfer concerns but CLOUD Act exposure applies to any Google-hosted endpoint used in fine-tuning or inference.
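The Art. 51(2) presumption is pegged to training compute rather than parameter count, so the 30.7B figure alone does not settle classification. A common back-of-envelope is compute ≈ 6 · N · D for N parameters and D training tokens; the sketch below shows where a 30.7B dense model lands under several assumed token budgets, none of which Google has disclosed.

```python
# Back-of-envelope training-compute check against the EU AI Act's
# 1e25-FLOP systemic-risk presumption (Art. 51(2)), using the standard
# compute ~= 6 * N * D approximation. N is from the card; the token
# budgets D are assumptions, since no token count has been disclosed.

THRESHOLD_FLOP = 1e25
N_PARAMS = 30.7e9

for tokens_trillions in (10, 20, 50, 100):   # assumed training-token budgets
    d = tokens_trillions * 1e12
    flop = 6 * N_PARAMS * d
    verdict = "above" if flop > THRESHOLD_FLOP else "below"
    print(f"{tokens_trillions:>3}T tokens -> {flop:.2e} FLOP ({verdict} 1e25)")
```

The break-even budget is about 54T tokens, which is why the risk item above points deployers at Google's own provider notification rather than a parameter-count heuristic.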
Reviewed by Ali Madjaji · Last reviewed 2026-04-17