Per the published model card, Granite 4.1 8B is an Apache 2.0-licensed, 9B-parameter dense decoder-only model with a 131k-token context window, trained on publicly available datasets, internal synthetic data, and human-curated material. IBM continues the training-data transparency, still unusual for the industry, that anchored the Granite 3 family, and offers IP indemnification when the model is consumed via watsonx. That combination makes it a strong default for regulated enterprise pilots that need a defensible weights-available alternative to hyperscaler frontier models.
Sovereignty
- Licence: Apache 2.0
- Commercial: Unrestricted
- Training data: Disclosed
- Origin: USA

Licence facts
- Parameters: 9B (8B class, dense)
- Architecture: Decoder-only transformer with GQA, RoPE, SwiGLU MLP, RMSNorm
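The architecture terms in the fact box can be made concrete with a minimal NumPy sketch of RMSNorm, a SwiGLU MLP, and grouped-query attention. The dimensions, weights, and head counts below are illustrative only, not Granite's actual configuration, and RoPE is omitted for brevity:

```python
import numpy as np

def rms_norm(x, gain, eps=1e-6):
    # RMSNorm: rescale by root-mean-square instead of mean/variance (no centering).
    return x * gain / np.sqrt(np.mean(x * x, axis=-1, keepdims=True) + eps)

def swiglu_mlp(x, w_gate, w_up, w_down):
    # SwiGLU MLP: SiLU-activated gate multiplied elementwise with a linear "up" branch.
    gate = x @ w_gate
    silu = gate / (1.0 + np.exp(-gate))          # SiLU(z) = z * sigmoid(z)
    return (silu * (x @ w_up)) @ w_down

def gqa_attention(x, wq, wk, wv, wo, n_q_heads, n_kv_heads):
    # Grouped-query attention: several query heads share one key/value head,
    # shrinking the KV cache relative to full multi-head attention.
    T, d = x.shape
    hd = d // n_q_heads
    q = (x @ wq).reshape(T, n_q_heads, hd)
    k = (x @ wk).reshape(T, n_kv_heads, hd)
    v = (x @ wv).reshape(T, n_kv_heads, hd)
    group = n_q_heads // n_kv_heads
    k = np.repeat(k, group, axis=1)              # broadcast KV heads across their group
    v = np.repeat(v, group, axis=1)
    q, k, v = (t.transpose(1, 0, 2) for t in (q, k, v))   # (heads, T, hd)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(hd)
    causal = np.triu(np.ones((T, T)), k=1).astype(bool)   # mask future positions
    scores = np.where(causal, -1e9, scores)
    attn = np.exp(scores - scores.max(-1, keepdims=True))
    attn /= attn.sum(-1, keepdims=True)
    out = (attn @ v).transpose(1, 0, 2).reshape(T, d)
    return out @ wo
```

In a real decoder block these pieces compose as `x + attn(rms_norm(x))` followed by `x + mlp(rms_norm(x))`; the sketch only shows each component in isolation.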
U.S. vendor jurisdiction is relevant only for users of IBM-hosted endpoints (watsonx); self-hosting the Apache 2.0 weights on EU infrastructure removes the cross-border data-transfer path entirely.
Training-data disclosure is by source category (publicly available, synthetic, and human-curated) rather than a per-dataset manifest. That is material enough for mapping to AI Act Article 53 obligations, but lighter than a fully named corpus list.
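To illustrate the self-hosting path noted above, the Apache 2.0 weights can be served on your own infrastructure behind an OpenAI-compatible API. This is a sketch assuming vLLM on a GPU host; the Hugging Face repository id is a placeholder, not confirmed by the source, and should be checked against IBM's published model collection:

```shell
# Sketch: self-host the weights in your own (e.g. EU) infrastructure with vLLM.
pip install vllm

# The model id below is a placeholder -- verify against IBM's Hugging Face org.
# --max-model-len matches the 131k context stated in the model card.
vllm serve ibm-granite/granite-4.1-8b-instruct \
    --max-model-len 131072 \
    --dtype bfloat16
```

Because the server exposes the OpenAI chat-completions API, existing client code can be pointed at the local endpoint with only a base-URL change, keeping all traffic inside your own jurisdiction.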