Mistral
Open Weights

Mistral Saba

Released Feb 2025

Intelligence
#352
Context: 32K
Parameters: 24B

Mistral Saba is a specialized language model developed by Mistral AI, serving as the company's first model custom-trained for specific regional geographies. With 24 billion parameters, it is specifically optimized for the linguistic and cultural nuances of the Middle East and South Asia, focusing on languages such as Arabic, Tamil, and Malayalam. The model is designed to provide high-speed inference on single-GPU systems, capable of processing text at speeds exceeding 150 tokens per second.

Despite its compact size compared to general-purpose frontier models, Mistral Saba demonstrates strong performance on regional benchmarks, often outperforming models with significantly higher parameter counts in Arabic conversational accuracy and cultural context. It utilizes a 32,768-token context window, allowing for the processing of relatively long documents and complex localized interactions.
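Before sending a long document to a model with a 32,768-token window, it helps to estimate whether the prompt will fit. The sketch below uses a rough 4-characters-per-token heuristic, which is an assumption, not Mistral's actual tokenizer; exact counts require the model's own tokenizer.

```python
# Rough check of whether a document fits in Saba's 32,768-token context
# window. The 4-characters-per-token ratio is a common heuristic for
# Latin-script text, not Mistral's real tokenization.
SABA_CONTEXT_TOKENS = 32_768
CHARS_PER_TOKEN = 4  # heuristic average; real ratios vary by language

def estimate_tokens(text: str) -> int:
    """Approximate token count from character length."""
    return len(text) // CHARS_PER_TOKEN + 1

def fits_in_context(text: str, reserved_for_output: int = 1_024) -> bool:
    """True if the prompt plus a reserved output budget fits the window."""
    return estimate_tokens(text) + reserved_for_output <= SABA_CONTEXT_TOKENS

doc = "x" * 100_000  # roughly 25K estimated tokens
print(fits_in_context(doc))  # → True
```

Non-Latin scripts such as Arabic or Malayalam often tokenize at a different ratio, so treat the estimate as a coarse upper-bound check rather than a guarantee.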

Architecturally, Saba follows the efficient design principles of the Mistral Small series, balancing performance with low operational cost. It is available both as an API and as open weights for local deployment, catering to organizations with specific data sovereignty and security requirements in the Middle Eastern and South Asian markets.
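For the API route, a request follows the standard chat-completions shape. The sketch below builds such a request with Python's standard library; the model identifier `mistral-saba-latest` is an assumption here and should be checked against Mistral's current model list, and the call is only attempted when an API key is configured.

```python
import json
import os
import urllib.request

# Assumed model identifier; verify against Mistral's model list.
MODEL_ID = "mistral-saba-latest"
API_URL = "https://api.mistral.ai/v1/chat/completions"

def build_payload(user_message: str) -> dict:
    """Assemble a chat-completions request body for the Saba model."""
    return {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": user_message}],
        "max_tokens": 256,
    }

payload = build_payload("مرحبا! عرّف بنفسك في جملة واحدة.")  # Arabic prompt

api_key = os.environ.get("MISTRAL_API_KEY")
if api_key:  # only send the request when a key is configured
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```

Organizations opting for local deployment would skip the HTTP layer entirely and serve the weights behind the same request schema on their own infrastructure.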

Rankings & Comparison