Side-by-side comparison of Qwen3-8B (Alibaba Cloud (Qwen), China) and Qwen3.6-27B (Alibaba (Qwen), China) for self-hosted deployment of the open-weight models. Both are rated conditional. They part ways on training-data disclosure: Qwen3-8B is listed as "Token count only", while Qwen3.6-27B is "Undisclosed".
| Field | Qwen3-8B | Qwen3.6-27B |
|---|---|---|
| Summary | | |
| Verdict | Conditional. Based on the published licence terms, Qwen3-8B is released under standard Apache 2.0 with no field-of-use carve-outs, making self-hosted commercial deployment viable. Training-data disclosure is limited to a token count, and the Chinese origin creates EU AI Act Art. 53 transparency and data-transfer risks that deployers should document. | Conditional. Per the published Apache 2.0 licence, the Qwen3.6-27B weights are deployable without commercial restriction, including for vision-language and 1M-context workloads. The blockers for regulated EU use are the China-based vendor and the absence of any training-data disclosure on the model card; both should be mitigated through self-hosting and a deployer-prepared GPAI compliance file. |
| Last reviewed | 2026-04-17 | 2026-04-28 |
| Open-weight | | |
| Licence | Apache 2.0 | Apache 2.0 |
| Commercial use | Unrestricted | Unrestricted |
| Training data | Token count only | Undisclosed |
| Origin | China (Hangzhou) | China |
| Performance & pricing | | |
| Quality index | 11/100 | 46/100 |
| Speed | 86 tok/s | 66 tok/s |
| Blended price | $0.31/M | $1.35/M |
| Context window | — | — |
| Evidence | | |
| Sources | | |
No overlapping sources between the two entries.
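The blended-price figures above ($0.31/M and $1.35/M) are single per-million-token numbers. Such blends are typically a weighted average of separate input and output token prices; the exact weighting behind these figures is not stated here. A minimal sketch of the usual calculation, assuming a hypothetical 3:1 input-to-output token ratio and illustrative per-token prices:

```python
def blended_price(input_usd_per_m: float, output_usd_per_m: float,
                  input_weight: float = 3.0, output_weight: float = 1.0) -> float:
    """Weighted-average cost in USD per million tokens.

    The 3:1 input:output weighting is an assumption for illustration,
    not the methodology behind the figures in the table above.
    """
    total = input_weight + output_weight
    return (input_usd_per_m * input_weight + output_usd_per_m * output_weight) / total


# Illustrative (not published) per-token prices:
print(f"${blended_price(0.20, 0.60):.2f}/M")  # -> $0.30/M
```

Whatever the underlying mix, the table values imply that Qwen3.6-27B costs roughly four times as much as Qwen3-8B per blended million tokens at the same traffic pattern.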