Mistral Small 3.1

Developer: Mistral (open weights)
Released: March 2025
Rankings: Intelligence #299 · Coding #231 · Math #252
Context: 128K tokens
Parameters: 24B

Mistral Small 3.1 is a 24-billion parameter multimodal language model developed by Mistral AI, released in March 2025. Serving as a successor to Mistral Small 3, this version introduces native vision and image understanding capabilities while significantly expanding the context window to 128,000 tokens. The model is released under the Apache 2.0 license, supporting both commercial and non-commercial applications.

Architecturally, Mistral Small 3.1 is optimized for high efficiency and low-latency inference, targeting enterprise use cases such as long-document analysis, visual inspection, and complex instruction following. It uses the Tekken tokenizer with a 131k-token vocabulary and performs strongly on multilingual and multimodal benchmarks, rivaling larger proprietary models.
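To give a rough sense of what a 128,000-token context window means for long-document analysis, the arithmetic below uses a common rule-of-thumb conversion of about 0.75 English words per token and 500 words per dense page; both figures are heuristic assumptions, not numbers published by Mistral.

```python
# Back-of-the-envelope sizing of a 128K-token context window.
CONTEXT_TOKENS = 128_000
WORDS_PER_TOKEN = 0.75   # assumed heuristic for English text
WORDS_PER_PAGE = 500     # assumed typical dense page

approx_words = int(CONTEXT_TOKENS * WORDS_PER_TOKEN)  # ~96,000 words
approx_pages = approx_words // WORDS_PER_PAGE         # ~192 pages

print(approx_words, approx_pages)  # → 96000 192
```

Under these assumptions, a single prompt can hold on the order of a 190-page document, which is what makes single-pass long-document analysis practical.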

Often referred to as the 2503 release after its March 2025 launch, the model is designed for local deployment on single-GPU hardware configurations. It features native function calling and enhanced reasoning capabilities, making it a strong foundation for agentic workflows and sophisticated conversational AI.
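As a sketch of how native function calling is typically exercised, the snippet below builds an OpenAI-compatible chat-completion request body carrying a tool schema, of the kind a self-hosted server such as vLLM would accept. The model identifier, tool name, and field values here are illustrative assumptions, not taken from Mistral's documentation or from this page.

```python
import json

# Hypothetical tool schema: a weather lookup the model may choose to call.
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",  # assumed tool name, for illustration only
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

# Request body in the OpenAI-compatible chat-completions shape.
request_body = {
    "model": "mistral-small-3.1",  # assumed model id; varies by deployment
    "messages": [{"role": "user", "content": "Weather in Paris?"}],
    "tools": [get_weather_tool],
    "tool_choice": "auto",  # let the model decide whether to call the tool
}

print(json.dumps(request_body, indent=2))
```

When the model opts to call the tool, the response contains a structured `tool_calls` entry with JSON arguments rather than free text, which is what agentic workflows dispatch on.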

Rankings & Comparison