Magistral Small 1 is an open-weight reasoning language model developed by Mistral AI, first released in June 2025. It is the compact variant of the Magistral family, Mistral's first line of models dedicated to multi-step reasoning, mathematical logic, and complex problem-solving. Built on the Mistral Small architectural foundation, it was trained with reinforcement learning to generate an internal chain-of-thought before producing a final response.
The model contains approximately 24 billion parameters and is designed for efficiency, capable of running on consumer-grade hardware such as a single high-end GPU or a laptop with 32 GB of RAM. It uses special [THINK] and [/THINK] tokens to delimit its reasoning traces, providing a transparent "inner monologue" that can be parsed out or hidden by users depending on the application.
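Because the reasoning trace is wrapped in literal [THINK] and [/THINK] tokens, separating it from the final answer is straightforward string processing. The following is a minimal illustrative sketch, not an official Mistral utility; the function name and behavior are assumptions for demonstration:

```python
import re

# [THINK]...[/THINK] delimit the model's reasoning trace in its raw output.
# This regex-based parser is a hypothetical example, not part of any SDK.
THINK_RE = re.compile(r"\[THINK\](.*?)\[/THINK\]", re.DOTALL)

def split_reasoning(output: str) -> tuple[str, str]:
    """Split a raw completion into (reasoning trace, user-facing answer)."""
    match = THINK_RE.search(output)
    if match is None:
        # No trace present: treat the whole completion as the answer.
        return "", output.strip()
    reasoning = match.group(1).strip()
    # Strip the trace so only the final response is shown to the user.
    answer = THINK_RE.sub("", output).strip()
    return reasoning, answer

raw = "[THINK]2 + 2 is 4, so doubling gives 8.[/THINK]The result is 8."
trace, answer = split_reasoning(raw)
print(trace)   # → 2 + 2 is 4, so doubling gives 8.
print(answer)  # → The result is 8.
```

An application could log `trace` for debugging while displaying only `answer`, or expose the trace behind an expandable "show reasoning" control.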
Magistral Small 1 supports a 128,000-token context window and is highly multilingual, covering dozens of languages including English, French, German, Spanish, Chinese, and Arabic. While early versions focused strictly on text reasoning, later updates within the 1.x series introduced multimodal capability through an integrated vision encoder, enabling visual reasoning tasks. The model is released under the Apache 2.0 license, permitting both commercial and non-commercial use.