Alibaba
Open Weights

Qwen3 4B 2507 (Reasoning)

Released Aug 2025

Intelligence: #235
Coding: #294
Math: #56
Context: 262K
Parameters: 4B

Qwen3-4B-Thinking-2507 is a compact language model developed by Alibaba Cloud's Qwen team, featuring 4 billion parameters. It is part of the Qwen3 series and is specifically optimized for reasoning-heavy tasks through an integrated "Thinking" mode. Released in August 2025, this version represents a significant update to the Qwen3 small-parameter line, aiming to deliver the deep logical and mathematical reasoning capabilities typically associated with larger models.

Capabilities and Reasoning

The model utilizes chain-of-thought (CoT) processing to solve complex problems, generating internal reasoning traces before producing a final response. This approach enhances its performance on benchmarks involving advanced mathematics, science, and coding. It is designed to handle multi-step deduction and exhibits improved alignment with human preferences in subjective and open-ended tasks. Unlike the standard instruction-tuned variants, the Thinking model prioritizes the quality of reasoning steps over raw generation speed.
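In practice, a Thinking-mode completion interleaves a reasoning trace with the final answer, and downstream code usually wants to separate the two. The sketch below shows one minimal way to do that, assuming the `<think>...</think>` delimiter convention used by Qwen3 Thinking models; the sample string is illustrative, not real model output, and some templates emit only the closing tag, so the parser tolerates a missing opener.

```python
def split_thinking(completion: str) -> tuple[str, str]:
    """Return (reasoning_trace, final_answer) from a raw completion.

    Assumes the trace ends at the first "</think>"; if no closing tag
    is present, the whole completion is treated as the answer.
    """
    open_tag, close_tag = "<think>", "</think>"
    if close_tag not in completion:
        return "", completion.strip()
    head, _, tail = completion.partition(close_tag)
    # The opening tag may be absent when the chat template supplies it.
    trace = head.replace(open_tag, "", 1).strip()
    return trace, tail.strip()

# Illustrative completion, not actual model output:
sample = "<think>2 + 2 equals 4, so the answer is 4.</think>The answer is 4."
trace, answer = split_thinking(sample)
print(answer)
```

Keeping the trace separate also makes it easy to strip reasoning tokens from conversation history before the next turn, which is the usual practice for multi-turn use of thinking models.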

Architecture and Context

Built on a causal language model architecture, Qwen3-4B-Thinking-2507 supports a native context length of 262,144 (256K) tokens, which can be extended for ultra-long context understanding. It features 36 layers and utilizes Grouped Query Attention (GQA) for efficient inference. The model is released with open weights under the Apache 2.0 license, making it practical to run complex reasoning workloads locally on consumer-grade hardware.
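GQA matters most at long context because the KV cache, not the weights, dominates memory. The back-of-envelope calculation below illustrates the effect; only the 36 layers and 262K context come from this card, while the head counts, head dimension, and fp16 dtype are illustrative assumptions, not published specs for this model.

```python
def kv_cache_bytes(layers: int, kv_heads: int, head_dim: int,
                   seq_len: int, bytes_per_elem: int = 2) -> int:
    # Two cached tensors (K and V) per layer, each shaped
    # [seq_len, kv_heads, head_dim], stored at fp16 (2 bytes).
    return 2 * layers * kv_heads * head_dim * seq_len * bytes_per_elem

LAYERS, HEAD_DIM, CONTEXT = 36, 128, 262_144  # head_dim is an assumption

# Full multi-head attention (assumed 32 KV heads) vs GQA (assumed 8):
mha = kv_cache_bytes(LAYERS, kv_heads=32, head_dim=HEAD_DIM, seq_len=CONTEXT)
gqa = kv_cache_bytes(LAYERS, kv_heads=8, head_dim=HEAD_DIM, seq_len=CONTEXT)
print(f"MHA (32 KV heads): {mha / 2**30:.1f} GiB")
print(f"GQA  (8 KV heads): {gqa / 2**30:.1f} GiB")
```

Under these assumptions, sharing each KV head across four query heads cuts the full-context cache to a quarter of the multi-head figure, which is what makes 256K-token inference feasible on smaller devices.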

Rankings & Comparison