Alibaba
Open Weights

Qwen3 30B A3B 2507 Instruct

Released Jul 2025

Intelligence: #287
Coding: #222
Math: #104
Context: 262K
Parameters: 30.5B

Qwen3-30B-A3B-Instruct-2507 is a 30.5-billion-parameter large language model developed by Alibaba Cloud, built on a Mixture-of-Experts (MoE) architecture. Released as part of a major update in July 2025, this variant is notable for its efficiency: only about 3.3 billion of its 30.5 billion parameters are active per token during inference. It is a "non-thinking" model, optimized for direct instruction following without the explicit internal reasoning chains found in the dedicated "thinking" variants.

The model features a native context window of 262,144 tokens, which can be extended to support up to 1 million tokens. It demonstrates substantial improvements in general capabilities over previous iterations, particularly in logical reasoning, mathematics, science, and multilingual comprehension across 119 languages. It is specifically engineered for agentic workflows, offering robust tool-calling capabilities and enhanced alignment with human preferences in open-ended tasks.
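The agentic tool-calling workflow mentioned above can be sketched with OpenAI-compatible message structures, which is how the model is typically driven behind an OpenAI-compatible serving endpoint. This is a minimal illustration only: the `get_weather` tool, its schema, and the dispatch logic are hypothetical, and the assistant turn containing the tool call is simulated here rather than generated by the model.

```python
import json

# Hypothetical tool schema that would be advertised to the model
# (OpenAI-compatible "function" tool format).
TOOLS = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

def get_weather(city: str) -> dict:
    # Stand-in implementation; a real agent would call an external API here.
    return {"city": city, "temp_c": 21}

def run_tool_call(call: dict) -> dict:
    """Dispatch one tool call emitted by the model and wrap the result
    as a `tool`-role message to append back to the conversation."""
    args = json.loads(call["function"]["arguments"])
    result = {"get_weather": get_weather}[call["function"]["name"]](**args)
    return {"role": "tool", "tool_call_id": call["id"], "content": json.dumps(result)}

# Simulated assistant turn containing a tool call
# (in a real agent loop this comes back from the model).
assistant_turn = {
    "role": "assistant",
    "tool_calls": [{
        "id": "call_0",
        "type": "function",
        "function": {"name": "get_weather", "arguments": '{"city": "Hangzhou"}'},
    }],
}

messages = [
    {"role": "user", "content": "What's the weather in Hangzhou?"},
    assistant_turn,
    run_tool_call(assistant_turn["tool_calls"][0]),
]
print(messages[-1]["content"])
```

In a full agent loop, the `messages` list (including the tool result) would be sent back to the model for a final natural-language answer.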

Built to be accessible for the developer community, Qwen3-30B-A3B-Instruct-2507 is released under the Apache 2.0 license. It maintains competitive performance on global benchmarks like MMLU-Pro and GPQA while remaining efficient enough for local deployment on consumer-grade hardware through various quantization formats.
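A rough sense of why quantization makes local deployment feasible: weight memory scales with the total parameter count, since all 30.5B parameters must be resident even though only ~3.3B are active per token. A back-of-the-envelope sketch (weights only, ignoring KV cache and runtime overhead):

```python
PARAMS = 30.5e9  # total parameters; all must fit in memory for an MoE model

def weight_gib(bits_per_param: float, params: float = PARAMS) -> float:
    """Approximate weight storage in GiB at a given precision."""
    return params * bits_per_param / 8 / 2**30

for bits, label in [(16, "BF16"), (8, "INT8"), (4, "4-bit")]:
    print(f"{label:>5}: ~{weight_gib(bits):.1f} GiB")
```

At 4-bit precision the weights fit in roughly 14–15 GiB, which is why quantized builds of this model can run on a single high-memory consumer GPU or in system RAM; long-context KV cache adds to this budget.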
