AI21 Labs
Open Weights

Jamba 1.7 Mini

Released Jul 2025

Intelligence
#431
Coding
#347
Math
#264
Context 256K
Parameters 52B (12B active)

Jamba 1.7 Mini is a hybrid large language model (LLM) developed by AI21 Labs and built on the Joint Attention and Mamba (Jamba) architecture, which interleaves State Space Model (SSM) layers (Mamba) with traditional Transformer attention layers to balance computational efficiency against performance, particularly on long-context tasks. It is implemented as a Mixture-of-Experts (MoE) system with 52 billion total parameters, of which 12 billion are active for any given token.

With a context window of 256,000 tokens, Jamba 1.7 Mini is designed to process and reason over large volumes of information, such as extensive document sets or long codebases. The hybrid design delivers higher throughput and lower memory consumption than Transformer-only architectures of similar scale. This version introduces improvements in grounding, steerability, and instruction following over previous iterations in the Jamba family.

The model weights are released under the Jamba Open Model License, which permits both research and commercial use. The model is optimized for enterprise use cases where accuracy and efficiency are critical, such as financial analysis, legal review, and automated document summarization.
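As a reference point, the sketch below shows one way to run the model locally with Hugging Face transformers, which includes native Jamba support. The repository id, dtype, and generation settings are assumptions based on AI21's published naming conventions rather than details from this page; check the Hugging Face Hub for the exact id and hardware requirements.

```python
# Minimal local-inference sketch for Jamba 1.7 Mini (repo id is an assumption).
# The optimized Mamba path additionally expects the mamba-ssm and
# causal-conv1d packages to be installed on a CUDA machine.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ai21labs/AI21-Jamba-Mini-1.7"  # assumption: verify on the Hub

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision to reduce memory footprint
    device_map="auto",           # shard the 52B weights across available GPUs
)

messages = [
    {"role": "user", "content": "Summarize this contract in three bullet points."}
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Note that because only 12B of the 52B parameters are active per token, inference compute is closer to that of a 12B dense model, but all 52B weights must still fit in GPU (or offloaded) memory.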

Rankings & Comparison