Devstral Small is a 24-billion-parameter language model specialized for agentic coding. Released in May 2025 by Mistral AI and developed in collaboration with All Hands AI, it targets complex software engineering tasks such as codebase exploration, bug fixing, and multi-file editing. It was designed as a specialized, open-weight alternative for developers who need high-performance coding capabilities on local hardware.
The model uses a dense Transformer architecture fine-tuned from Mistral Small 3.1. During the specialization process, the vision encoder of the base model was removed to prioritize text-based programming and reasoning. It uses the Tekken tokenizer with a 131k-token vocabulary and is optimized to run on consumer-grade hardware, such as a single high-end GPU or a workstation with 32 GB of RAM.
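The 32 GB figure can be understood with a back-of-envelope calculation. The sketch below (illustrative only; it counts weights alone and ignores activation and KV-cache overhead, and the 4-bit figure assumes quantized deployment, which the source does not specify) shows why a 24-billion-parameter model fits on such hardware:

```python
# Back-of-envelope memory estimate for model weights only.
# Activations and KV cache add further overhead not counted here.
PARAMS = 24e9  # 24 billion parameters

def weight_memory_gb(params: float, bytes_per_param: float) -> float:
    """Approximate weight storage in gigabytes (decimal GB)."""
    return params * bytes_per_param / 1e9

fp16_gb = weight_memory_gb(PARAMS, 2.0)  # 16-bit weights: 2 bytes each
q4_gb = weight_memory_gb(PARAMS, 0.5)    # 4-bit quantized: 0.5 bytes each

print(f"fp16: {fp16_gb:.0f} GB, 4-bit: {q4_gb:.0f} GB")  # fp16: 48 GB, 4-bit: 12 GB
```

At full 16-bit precision the weights alone exceed 32 GB, so local deployment on such a workstation implies a quantized variant, which leaves headroom for the KV cache of a long context window.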
Devstral Small supports a 128,000-token context window, enabling it to reason over large segments of project-specific context. It is specifically optimized for use with agentic scaffolds like OpenHands and supports both Mistral-style tool calling and XML output formats. At its launch, the model achieved a score of 46.8% on the SWE-bench Verified benchmark, outperforming many significantly larger open-source alternatives.
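The tool-calling support mentioned above follows the function-calling convention used by OpenAI- and Mistral-style chat-completion APIs. The sketch below builds such a request payload; the tool name `read_file`, the model identifier, and the endpoint semantics are illustrative assumptions, not documented specifics of Devstral Small:

```python
import json

def build_tool_call_request(model: str, user_prompt: str) -> dict:
    """Assemble a chat-completion request exposing one callable tool,
    in the style an agentic scaffold might send to the model."""
    tools = [{
        "type": "function",
        "function": {
            "name": "read_file",  # illustrative tool name
            "description": "Return the contents of a file in the repository.",
            "parameters": {
                "type": "object",
                "properties": {
                    "path": {
                        "type": "string",
                        "description": "Repository-relative file path.",
                    }
                },
                "required": ["path"],
            },
        },
    }]
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_prompt}],
        "tools": tools,
        "tool_choice": "auto",  # let the model decide when to call the tool
    }

request = build_tool_call_request(
    "devstral-small",  # placeholder model identifier
    "Find where the config loader is defined and summarize it.",
)
print(json.dumps(request, indent=2))
```

When the model decides to use the tool, the response contains a `tool_calls` entry with the function name and JSON arguments; the scaffold executes the call and feeds the result back as a `tool` message, which is how frameworks like OpenHands drive multi-step codebase exploration.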
The model is distributed under the Apache 2.0 license, permitting broad commercial and research use. It is accessible through official APIs and as open weights for local deployment.