Jamba 1.5 Large is an open-weights large language model developed by AI21 Labs, built on a hybrid architecture that interleaves Mamba Structured State Space Model (SSM) layers with Transformer attention layers. Because the SSM layers do not require a key-value cache, this design handles very long contexts with a smaller memory footprint and lower latency than Transformer-only architectures of comparable size.
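The interleaving can be pictured as a repeating layer schedule. The sketch below is illustrative, not the actual implementation; the 1:7 attention-to-Mamba ratio follows the published Jamba design, while the block and layer counts are assumptions chosen for clarity.

```python
def build_layer_schedule(num_blocks: int = 4, layers_per_block: int = 8,
                         attn_position: int = 4) -> list[str]:
    """Illustrative Jamba-style schedule: one attention layer per block
    of eight, the rest Mamba (SSM) layers. Counts are assumptions."""
    schedule = []
    for _ in range(num_blocks):
        for i in range(layers_per_block):
            schedule.append("attention" if i == attn_position else "mamba")
    return schedule


schedule = build_layer_schedule()
# With the defaults above: 4 attention layers, 28 Mamba layers.
print(schedule.count("attention"), schedule.count("mamba"))
```

The key point the schedule captures is that attention is sparse in depth: most layers are SSM layers, which is where the memory and latency savings come from.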
The model employs a Mixture-of-Experts (MoE) architecture with 398 billion total parameters, of which only 94 billion are active per token during inference. This allows high-capacity reasoning without the proportional computational cost of an equally sized dense model. The model's knowledge cutoff is March 5, 2024.
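The gap between total and active parameters comes from top-k expert routing: per token, a gating network scores all experts, but only the top-k actually run. The sketch below illustrates the mechanism; the expert count and top-k value (16 experts, top-2) follow the published Jamba design, while the dimensions and weights are random placeholders, not the model's.

```python
import numpy as np

rng = np.random.default_rng(0)
num_experts, top_k, d = 16, 2, 8      # illustrative dimensions

token = rng.normal(size=d)
gate_w = rng.normal(size=(d, num_experts))

# Gating: score every expert, keep only the top-k for this token.
scores = token @ gate_w
chosen = np.argsort(scores)[-top_k:]

# Renormalize the gate weights over the chosen experts.
weights = np.exp(scores[chosen])
weights /= weights.sum()

# Only the chosen experts' feed-forward weights are touched, so the
# active parameter count per token is a fraction of the total.
expert_ffns = rng.normal(size=(num_experts, d, d))
out = sum(w * (token @ expert_ffns[e]) for w, e in zip(weights, chosen))
```

In this toy setup only 2 of 16 expert FFNs execute per token, which is the same principle that keeps Jamba 1.5 Large at 94B active parameters out of 398B total.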
Jamba 1.5 Large supports a 256K-token context window, enabling analysis of long-form documents, entire code repositories, and large datasets in a single pass. It natively supports function calling and structured JSON output, making it suitable for agentic workflows and complex data-extraction tasks.
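A typical agentic round-trip pairs a tool schema sent to the model with validation of the structured JSON it returns. The sketch below is a hypothetical illustration: the tool name, fields, and schema layout are assumptions (modeled on the common JSON-Schema tool format), not AI21's documented interface.

```python
import json

# Hypothetical tool definition the application would send to the model.
tools = [{
    "type": "function",
    "function": {
        "name": "get_invoice_total",   # hypothetical tool name
        "description": "Sum line items from an extracted invoice.",
        "parameters": {
            "type": "object",
            "properties": {"invoice_id": {"type": "string"}},
            "required": ["invoice_id"],
        },
    },
}]

# A structured-JSON reply from the model is parsed and validated
# before the application acts on it.
raw_reply = '{"invoice_id": "INV-001", "total": 1250.5}'
parsed = json.loads(raw_reply)
missing = {"invoice_id", "total"} - parsed.keys()
if missing:
    raise ValueError(f"model reply missing fields: {missing}")
```

Validating the reply against the expected fields before acting on it is the practical payoff of structured output: downstream code can treat the model's answer as data rather than free text.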
The model is multilingual, with official support for English, Spanish, French, Portuguese, Italian, Dutch, German, Arabic, and Hebrew. It is released under the Jamba Open Model License, which permits research and commercial applications.