Devstral 2 is a large-scale, coding-centric language model developed by Mistral AI and released in December 2025. It is designed for production-grade software engineering workflows and agentic tasks, such as multi-file reasoning, repository-wide automation, and autonomous codebase exploration. Unlike many contemporaneous frontier models that utilize Mixture-of-Experts (MoE) architectures, Devstral 2 employs a dense transformer architecture with approximately 123 billion parameters.
The model is optimized for agentic coding, supporting advanced tool-calling capabilities and maintaining architecture-level context across complex projects. It achieves a 72.2% score on the SWE-bench Verified benchmark, placing it at the frontier of open-weight coding models. Devstral 2 is also multimodal, allowing it to process image inputs such as architecture diagrams, user interface screenshots, and visual error traces alongside text and code.
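Tool calling in agentic coding workflows typically works by advertising a schema of callable functions alongside the prompt, which the model can then invoke. As a minimal sketch, the following shows the general shape of such a request payload in the common OpenAI-style format; the model identifier "devstral-2" and the `read_file` tool are illustrative placeholders, not taken from official Mistral documentation.

```python
import json

def build_tool_call_request(prompt: str) -> dict:
    """Build a chat request advertising one tool the model may call.

    The model name and tool schema here are hypothetical examples of the
    request format, not the documented API of Devstral 2.
    """
    return {
        "model": "devstral-2",  # placeholder model identifier
        "messages": [{"role": "user", "content": prompt}],
        "tools": [
            {
                "type": "function",
                "function": {
                    "name": "read_file",  # hypothetical repository tool
                    "description": "Return the contents of a file in the repository",
                    "parameters": {
                        "type": "object",
                        "properties": {"path": {"type": "string"}},
                        "required": ["path"],
                    },
                },
            }
        ],
    }

request = build_tool_call_request("Summarize src/main.py")
print(json.dumps(request, indent=2))
```

An agent harness would send this payload to an inference endpoint, execute any tool call the model emits (here, reading the requested file), and append the result to the conversation before the next turn.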
Devstral 2 supports a context window of 256,000 tokens, enabling the analysis of extensive documentation and large source-code repositories. It was released alongside a smaller variant, Devstral Small 2 (24B); the 123B model is distributed under a modified MIT license.
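To give a sense of scale, a 256,000-token window can be related to raw source-code size with a rough estimate. The sketch below uses the common ~4-characters-per-token heuristic, which is an assumption; actual token counts depend on the model's tokenizer and the content being encoded.

```python
# Back-of-envelope check of whether a body of source text fits in a
# 256,000-token context window, assuming ~4 characters per token
# (a heuristic, not a property of Devstral 2's actual tokenizer).

CONTEXT_WINDOW = 256_000
CHARS_PER_TOKEN = 4  # assumed average; real tokenizers vary

def estimated_tokens(total_chars: int) -> int:
    """Estimate the token count of `total_chars` characters of text."""
    return total_chars // CHARS_PER_TOKEN

def fits_in_context(total_chars: int) -> bool:
    """True if the estimated token count fits in the context window."""
    return estimated_tokens(total_chars) <= CONTEXT_WINDOW

# Under this heuristic, ~1 MB of source text (~250k tokens) just fits,
# while ~2 MB (~500k tokens) does not:
print(fits_in_context(1_000_000))  # -> True
print(fits_in_context(2_000_000))  # -> False
```

By this estimate, the window corresponds to roughly one megabyte of source text, which is why such windows are described as holding entire repositories or large documentation sets.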