AI21 Labs
Open Weights

Jamba 1.7 Large

Released Jul 2025

Intelligence: #368
Coding: #308
Math: #258
Context: 256K
Parameters: 398B

Jamba 1.7 Large is a large language model developed by AI21 Labs that uses a hybrid SSM-Transformer architecture, combining the long-context efficiency of Mamba state space layers with the reasoning strength of Transformer attention layers. Its 256K-token context window lets it process extremely long inputs and complex multi-step instructions while sustaining high inference speed.
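
To make the interleaving idea concrete, here is a minimal PyTorch toy, a sketch under stated assumptions rather than AI21's implementation: the layer ratio, dimensions, and the simplified linear recurrence are all illustrative. A few fixed-size-state SSM layers are followed by one attention layer.

```python
# Toy hybrid SSM/attention stack. Illustrative only: real Mamba layers use
# input-dependent (selective) parameters and a hardware-aware parallel scan.
import torch
import torch.nn as nn

class SimpleSSMLayer(nn.Module):
    """Linear recurrence scanned over the sequence with a fixed-size state."""
    def __init__(self, d_model: int, d_state: int = 16):
        super().__init__()
        self.A = nn.Parameter(torch.randn(d_state, d_state) * 0.01)
        self.B = nn.Linear(d_model, d_state, bias=False)
        self.C = nn.Linear(d_state, d_model, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (batch, seq, d_model)
        state = x.new_zeros(x.size(0), self.A.size(0))
        outs = []
        for t in range(x.size(1)):
            # State size is constant, so memory does not grow with context
            # length the way an attention KV cache does.
            state = state @ self.A.T + self.B(x[:, t])
            outs.append(self.C(state))
        return x + torch.stack(outs, dim=1)  # residual connection

class AttentionLayer(nn.Module):
    """Standard multi-head self-attention block."""
    def __init__(self, d_model: int, n_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out, _ = self.attn(x, x, x)
        return x + out

class HybridBlock(nn.Module):
    """Mostly SSM layers with a periodic attention layer, echoing the
    hybrid pattern described above (ratio is an assumed toy value)."""
    def __init__(self, d_model: int = 64, ssm_per_attn: int = 3):
        super().__init__()
        layers = [SimpleSSMLayer(d_model) for _ in range(ssm_per_attn)]
        layers.append(AttentionLayer(d_model))
        self.layers = nn.ModuleList(layers)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for layer in self.layers:
            x = layer(x)
        return x

x = torch.randn(2, 128, 64)      # (batch, sequence, hidden)
print(HybridBlock()(x).shape)    # torch.Size([2, 128, 64])
```

The trade-off is visible in the recurrence: the SSM layers carry a constant-size state across the sequence, which is what makes very long contexts cheap, while the occasional attention layer restores direct token-to-token interaction.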

The model is built on a Mixture-of-Experts (MoE) structure with 398 billion total parameters, of which 94 billion are active in any single forward pass. This design balances high-capacity intelligence with better throughput and lower latency than dense models of comparable size. The model natively supports English, Spanish, French, German, Portuguese, Italian, Dutch, Arabic, and Hebrew.
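
At those numbers, roughly 24% of the weights (94B of 398B) are exercised per token. The sketch below shows the generic top-k routing mechanism behind that saving; the expert count, gate, and top-2 choice are assumptions for illustration, not details of AI21's router.

```python
# Generic top-k MoE routing: each token is sent to only k of n experts,
# so per-token compute scales with active, not total, parameters.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    def __init__(self, d_model: int = 64, n_experts: int = 8, k: int = 2):
        super().__init__()
        self.k = k
        self.router = nn.Linear(d_model, n_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(),
                          nn.Linear(4 * d_model, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (tokens, d_model)
        weights, idx = self.router(x).topk(self.k, dim=-1)  # pick k experts/token
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e          # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, slot, None] * expert(x[mask])
        return out

moe = TopKMoE()
tokens = torch.randn(10, 64)
print(moe(tokens).shape)  # torch.Size([10, 64]); only 2 of 8 experts ran per token
```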

Jamba 1.7 Large advances grounding and instruction-following capabilities. It is designed to stay faithful to supplied context and to be highly steerable, making it well suited to specialized enterprise applications such as financial research, legal analysis, and advanced retrieval-augmented generation (RAG) systems. The model is released under the Jamba Open Model License, which permits both research and commercial use.
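
Since the weights are open, the model can be loaded with standard tooling. Below is a minimal sketch, assuming a Hugging Face checkpoint id of ai21labs/AI21-Jamba-Large-1.7 (check the model card for the exact id and hardware requirements; the full model needs multi-GPU-scale memory). The RAG-style prompt is likewise illustrative.

```python
# Minimal loading-and-prompting sketch with Hugging Face transformers.
# The repo id below is an assumption; verify it against the model card.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ai21labs/AI21-Jamba-Large-1.7"  # assumed checkpoint id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="bfloat16",  # reduce memory footprint; weights are large
    device_map="auto",       # shard across available devices (needs accelerate)
)

# Grounded, RAG-style prompting: place retrieved passages in the long context
# and instruct the model to answer only from them.
messages = [
    {"role": "system", "content": "Answer only from the provided documents."},
    {"role": "user", "content": "Documents:\n<retrieved passages here>\n\nQuestion: ..."},
]
inputs = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(inputs, max_new_tokens=200)[0]))
```

The 256K-token window is what makes this pattern practical: many retrieved documents can be placed directly in the prompt instead of being aggressively truncated or summarized first.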

Rankings & Comparison