Alibaba
Open Weights

Qwen3 8B (Reasoning)

Released Apr 2025

Intelligence: #327
Coding: #300
Math: #212
Context: 131K tokens
Parameters: 8.2B

Qwen3 8B (Reasoning) is an 8.2 billion parameter language model developed by Alibaba Cloud's Qwen team. Released as part of the Qwen3 series, the model is designed with a Hybrid Reasoning architecture that allows it to toggle between a high-speed "non-thinking" mode for general dialogue and a "thinking" mode for complex logical tasks. This reasoning capability is integrated directly into the model, enabling it to generate intermediate chain-of-thought (CoT) steps within <think> tags for mathematics, programming, and symbolic logic.
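Because the reasoning trace is emitted inline between <think> tags, a client typically needs to separate the chain of thought from the final answer. A minimal sketch of that post-processing step, assuming the CoT (when present) appears as a single <think>...</think> block at the start of the completion:

```python
import re

def split_reasoning(text: str) -> tuple[str, str]:
    """Split a Qwen3-style completion into (chain_of_thought, final_answer).

    Assumes the reasoning, if present, is a single <think>...</think>
    block at the start of the completion.
    """
    match = re.match(r"\s*<think>(.*?)</think>\s*", text, flags=re.DOTALL)
    if match:
        return match.group(1).strip(), text[match.end():].strip()
    return "", text.strip()  # non-thinking mode: no CoT emitted

cot, answer = split_reasoning("<think>2 + 2 is 4.</think>The answer is 4.")
```

In non-thinking mode the model skips the <think> block entirely, so the helper simply returns an empty reasoning string alongside the answer.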

Architecture and Training

The model is a causal transformer with 36 layers and employs Grouped-Query Attention (GQA) for inference efficiency. It was pretrained on roughly 36 trillion tokens, about double the 18 trillion used for its predecessor, Qwen2.5. The training corpus spans 119 languages and places heavy emphasis on STEM, coding, and high-quality synthetic data to strengthen multi-step reasoning.
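The practical payoff of GQA is a smaller KV cache: keys and values are stored only for the shared KV heads, not for every query head. A back-of-the-envelope sketch, using the 36 layers from the text; the 8 KV heads, 32 query heads, and 128-dim heads are assumptions for an 8B-class model, not confirmed figures:

```python
def kv_cache_bytes(layers: int, kv_heads: int, head_dim: int,
                   seq_len: int, bytes_per_elem: int = 2) -> int:
    # 2x for keys and values; one cached vector per layer/head/position,
    # stored in fp16 (2 bytes per element) by default.
    return 2 * layers * kv_heads * head_dim * seq_len * bytes_per_elem

# GQA with 8 KV heads vs. a hypothetical full-MHA layout with 32 KV heads,
# both at the 32,768-token native context.
gqa = kv_cache_bytes(layers=36, kv_heads=8, head_dim=128, seq_len=32768)
mha = kv_cache_bytes(layers=36, kv_heads=32, head_dim=128, seq_len=32768)
print(f"GQA cache: {gqa / 2**30:.1f} GiB vs MHA: {mha / 2**30:.1f} GiB")
```

Under these assumptions the cache shrinks by the query-to-KV head ratio (4x here), which is what makes long-context inference feasible at this parameter scale.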

Key Capabilities

Qwen3 8B supports a native context window of 32,768 tokens, extensible to 131,072 tokens with techniques such as YaRN. It natively supports the Model Context Protocol (MCP) and robust function calling, making it suitable for complex agentic workflows. In benchmarks, the reasoning-enabled variant posts mathematical-reasoning (AIME) and coding (LiveCodeBench) scores that rival much larger dense models while retaining the deployment footprint of an 8B-parameter model.
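The YaRN extension is usually enabled through a rope_scaling entry in the model configuration. A hedged sketch of what that looks like; the exact key names follow the convention used in recent Qwen model cards and should be checked against the serving stack in use:

```python
# YaRN stretches the RoPE position encoding by a fixed factor, so the
# extended window is simply factor x native window.
rope_scaling = {
    "rope_type": "yarn",
    "factor": 4.0,  # scale the 32,768-token native window
    "original_max_position_embeddings": 32768,
}

extended_context = int(rope_scaling["factor"]
                       * rope_scaling["original_max_position_embeddings"])
print(extended_context)  # 131072
```

Since YaRN applies a static scaling factor, model cards generally recommend enabling it only when prompts actually exceed the native 32K window, as it can slightly degrade short-context quality.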
