AI21 Labs
Open Weights

Jamba Reasoning 3B

Released Oct 2025

Intelligence: #397
Coding: #354
Math: #231
Context: 262K
Parameters: 3B

Jamba Reasoning 3B is a compact large language model developed by AI21 Labs, specifically designed for complex, multi-step reasoning tasks on consumer hardware. It utilizes a hybrid SSM-Transformer architecture, which integrates 26 Mamba State Space Model (SSM) layers with 2 Transformer attention layers. This design is optimized for high throughput and memory efficiency, enabling the model to process extremely long sequences with significantly lower computational overhead than traditional transformer-only models.
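The interleaving of Mamba and attention layers described above can be sketched in a few lines. The even spacing of the two attention layers below is an assumption for illustration; the model's actual layer placement may differ from this pattern.

```python
# Sketch of a hybrid SSM-Transformer layer stack: 26 Mamba (SSM) layers
# interleaved with 2 attention layers, as described for Jamba Reasoning 3B.
# The placement of the attention layers is an illustrative assumption.

def build_layer_stack(n_mamba=26, n_attention=2):
    """Return an ordered list of layer types, spacing the attention
    layers evenly through the stack (assumed pattern)."""
    total = n_mamba + n_attention
    step = total // (n_attention + 1)
    # Attention layers at evenly spaced depths, e.g. positions 9 and 18.
    attention_positions = {step * (i + 1) for i in range(n_attention)}
    return ["attention" if i in attention_positions else "mamba"
            for i in range(total)]

stack = build_layer_stack()
```

Because attention layers are the ones whose memory cost grows with sequence length, keeping only two of them in a 28-layer stack is what makes the design memory-efficient at long context.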

The model is specialized for logic, mathematics, and programming, leveraging chain-of-thought training techniques to improve its problem-solving accuracy. One of its most distinctive features is its 256,000-token context window, which allows it to handle extensive documents, large codebases, or complex data analysis tasks in a single pass. AI21 Labs reported that the model achieves inference speeds up to five times faster than similarly sized dense models during long-context processing.
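A back-of-the-envelope estimate shows why a hybrid stack stays cheap at a 256,000-token context: the KV cache grows linearly with context length only for the attention layers. The head count, head dimension, and 16-bit cache values below are illustrative assumptions, not the model's published configuration.

```python
# Rough KV-cache size comparison: 2 attention layers (hybrid) vs. a
# hypothetical dense baseline where all 28 layers use attention.
# Head counts and dimensions are assumptions for illustration only.

def kv_cache_bytes(n_attn_layers, context_len, n_kv_heads=8,
                   head_dim=128, bytes_per_value=2):
    # Keys and values: 2 tensors per attention layer, each of shape
    # [context_len, n_kv_heads, head_dim], stored at 2 bytes (fp16/bf16).
    return (2 * n_attn_layers * context_len
            * n_kv_heads * head_dim * bytes_per_value)

ctx = 256_000
hybrid = kv_cache_bytes(2, ctx)    # hybrid: only 2 attention layers
dense = kv_cache_bytes(28, ctx)    # dense baseline: all 28 layers attend
print(f"hybrid: {hybrid / 2**30:.1f} GiB, dense: {dense / 2**30:.1f} GiB")
```

Under these assumed dimensions the dense baseline needs 14x the cache memory of the hybrid stack at the same context length, which is the kind of gap that makes long-context inference feasible on consumer hardware.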

Released under the Apache 2.0 license, Jamba Reasoning 3B is intended for on-device AI applications, including smartphones and laptops. Its open-weight availability facilitates private, cost-effective deployment for developers seeking to implement agentic workflows and specialized reasoning systems without relying on cloud-based APIs.
