Side-by-side comparison of DBRX Instruct (Databricks · USA) and DeepSeek-V4-Flash (DeepSeek · China) for self-hosted deployment of open-weight models. Both are rated conditional. They part ways on licence: DBRX Instruct ships under the bespoke "Databricks Open" licence, DeepSeek-V4-Flash under MIT.
| Field | DBRX Instruct | DeepSeek-V4-Flash |
|---|---|---|
| Summary | | |
| Verdict | Conditional. 132B MoE (36B active). The Databricks Open Model License is bespoke: it allows commercial use, but with an acceptable-use policy and a 700M-MAU-style cap. Read the licence carefully; it is not Apache 2.0. | Conditional. DeepSeek-V4-Flash is the smaller-active sibling of V4-Pro under the same MIT terms. The weights are permissively licensed, but the China-based vendor and undocumented training corpus mean any EU deployment still needs a self-hosted topology (see the sketch after the table) and a deployer-side GPAI documentation file under AI Act Article 53. |
| Last reviewed | 2026-04-15 | 2026-04-28 |
| Open-weight | | |
| Licence | Databricks Open | MIT |
| Commercial use | With caps | Unrestricted |
| Training data | Undisclosed | Categories only |
| Origin | USA | China |
| Performance & pricing | | |
| Quality index | 8/100 | 47/100 |
| Speed | — | 79 tok/s |
| Blended price | — | $0.17/M |
| Context window | — | — |
| Evidence | | |
| Sources | | |
No overlapping sources between the two entries.
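
Both verdicts make self-hosting the deciding condition, so here is a minimal sketch of what that topology looks like in practice. It assumes the weights are published as a Hugging Face repository (the ID `deepseek-ai/DeepSeek-V4-Flash` below is a hypothetical placeholder, not a confirmed release name) and are served through vLLM's OpenAI-compatible endpoint; the same pattern applies to DBRX Instruct, licence terms permitting.

```python
# Minimal self-hosted sketch: serve the open weights locally with vLLM's
# OpenAI-compatible server, then query the local endpoint. Start the server
# with something like:
#
#   vllm serve deepseek-ai/DeepSeek-V4-Flash --port 8000
#
# (The repo ID is a hypothetical placeholder; substitute the actual release.)
from openai import OpenAI

# Point the client at the local server; nothing here talks to a vendor API,
# so prompts and completions never leave the deployer's infrastructure.
client = OpenAI(
    base_url="http://localhost:8000/v1",
    api_key="unused",  # vLLM ignores the key unless --api-key was set
)

response = client.chat.completions.create(
    model="deepseek-ai/DeepSeek-V4-Flash",  # must match the served model ID
    messages=[
        {"role": "user", "content": "Summarise your licence terms in one sentence."}
    ],
    max_tokens=128,
)
print(response.choices[0].message.content)
```

Keeping the endpoint local is what makes the "self-hosted topology" condition concrete: the deployer controls the serving stack end to end, the same stack the deployer-side documentation file would describe.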