Alibaba
Open Weights

Qwen3.5 27B (Non-reasoning)

Released Feb 2026

Intelligence: #70
Coding: #70
Context: 262K
Parameters: 27B

Qwen3.5 27B is a dense, multimodal large language model developed by Alibaba Cloud's Qwen team and released in February 2026. As the primary dense variant in the Qwen3.5 series, it is designed to provide high-performance text and vision processing without the routing overhead associated with Mixture-of-Experts (MoE) architectures. The model distinguishes itself as the "non-reasoning" version, focusing on standard instruction-following and general-purpose assistance rather than the extended chain-of-thought reasoning found in the specialized "Thinking" variants of the same family.

Architecture and Capabilities

The model uses a hybrid architecture that interleaves Gated Delta Networks (a form of linear attention) with standard transformer attention layers, enabling high throughput and reduced latency during inference. It features early-fusion multimodal training, in which vision and text tokens are processed by a single unified backbone. This allows the model to reason over complex visual inputs, including diagrams, charts, and scanned documents, with benchmark results that compete with significantly larger models from previous generations.
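The gated delta update mentioned above can be sketched as a simple recurrence over a fast-weight state matrix. This is a generic illustration of the gated delta rule from the linear-attention literature, not Alibaba's actual implementation; the shapes, gate names, and scalar gating are assumptions for illustration.

```python
import numpy as np

def gated_delta_step(S, q, k, v, alpha, beta):
    """One recurrent step of a gated delta rule (linear attention).

    S:     (d_k, d_v) fast-weight state matrix
    q, k:  (d_k,) query/key vectors (k assumed L2-normalized)
    v:     (d_v,) value vector
    alpha: scalar in (0, 1], per-step decay gate
    beta:  scalar in (0, 1], per-step write strength

    NOTE: hypothetical sketch; a real model computes q, k, v, alpha,
    beta from the hidden state and runs this over heads in parallel.
    """
    # Decay the old state, erase the value currently stored under key k
    # (the rank-1 "delta" correction), then write the new (k, v) pair.
    S = alpha * (S - beta * np.outer(k, k @ S)) + beta * np.outer(k, v)
    o = S.T @ q  # read-out for this timestep
    return S, o
```

Because the state is a fixed-size matrix rather than a growing key-value cache, each step costs O(d_k * d_v) regardless of sequence length, which is the source of the throughput and latency advantage over full attention.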

Qwen3.5 27B supports a native context window of 262,144 tokens, extendable to 1 million tokens for processing long-form documents and large codebases. It provides multilingual support for over 201 languages and dialects, making it suitable for global deployment. The model is particularly optimized for agentic tasks, tool use, and coding, benefiting from scalable reinforcement learning conducted across large multi-agent environments during training.
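When working with the two context limits above, it helps to budget a prompt before sending it. Below is a minimal sketch; the 4-characters-per-token heuristic and the helper name are assumptions for illustration, and a real deployment should count tokens with the model's own tokenizer instead.

```python
NATIVE_CONTEXT = 262_144      # tokens, per the model card
EXTENDED_CONTEXT = 1_000_000  # tokens, with context extension enabled

def fits_context(text: str, reserve_for_output: int = 4_096,
                 chars_per_token: float = 4.0) -> str:
    """Classify a prompt as fitting the native window, the extended
    window, or neither, leaving room for the model's reply.

    chars_per_token is a rough heuristic (an assumption here), not a
    substitute for real tokenization.
    """
    est_tokens = len(text) / chars_per_token + reserve_for_output
    if est_tokens <= NATIVE_CONTEXT:
        return "native"
    if est_tokens <= EXTENDED_CONTEXT:
        return "extended"
    return "too_long"
```

For example, a document of roughly 500,000 estimated tokens would exceed the native window but still fit once context extension is enabled.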

Rankings & Comparison