Ministral 3 14B is a dense, multimodal language model released by Mistral AI in December 2025 as part of the Ministral 3 family. It is designed to bring high-performance intelligence to edge and local computing environments, offering capabilities comparable to those of larger models while keeping a footprint suitable for private deployments and resource-constrained hardware.
The model's architecture combines a 13.5 billion parameter language core with a 0.4 billion parameter vision encoder, allowing for unified reasoning across text and image inputs. As a dense transformer model, it provides consistent performance and is optimized for low-latency inference on single-GPU setups and edge devices.
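The parameter counts above translate directly into serving requirements. As a rough illustration (a back-of-the-envelope sketch, not an official sizing guide), the memory needed just to hold the weights can be estimated from the parameter count and the bytes per parameter at a given precision; real deployments add overhead for the KV cache and activations:

```python
# Back-of-the-envelope weight-memory estimate for a dense model of
# Ministral 3 14B's size. Parameter counts come from the text above;
# bytes-per-parameter values are standard for each precision. Actual
# serving memory (KV cache, activations) will be higher.

LANGUAGE_PARAMS = 13.5e9   # language core
VISION_PARAMS = 0.4e9      # vision encoder

def weight_memory_gb(total_params: float, bytes_per_param: float) -> float:
    """Gigabytes required just to store the weights."""
    return total_params * bytes_per_param / 1e9

total = LANGUAGE_PARAMS + VISION_PARAMS  # 13.9B parameters in total

for precision, bpp in [("fp16/bf16", 2.0), ("int8", 1.0), ("int4", 0.5)]:
    print(f"{precision}: {weight_memory_gb(total, bpp):.1f} GB")
```

At half precision the weights alone need roughly 28 GB, which is why quantized (int8 or int4) variants are the usual route to single-GPU and edge deployment for models of this size.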
Key features include a large context window of 256,000 tokens and advanced agentic capabilities, such as native function calling and structured JSON output. These features enable the model to handle long-horizon tasks, complex document analysis, and autonomous tool use. It is highly multilingual, supporting over 40 languages including English, French, Spanish, German, Chinese, and Japanese.
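Native function calling follows a now-common pattern: the application declares tools as JSON-schema descriptions, and the model emits a structured call that the application parses and routes to a local handler. The sketch below illustrates that pattern only; the tool schema, the response shape, and the `get_weather` handler are hypothetical examples, not actual Ministral API output:

```python
import json

# Illustrative sketch of the function-calling round trip. The tool
# schema and the model_output string below are assumptions chosen to
# match the shape many chat-completion APIs use, not real API output.

get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

# A structured tool call as the model might return it (hypothetical).
model_output = '{"name": "get_weather", "arguments": {"city": "Paris"}}'

def dispatch(raw_call: str, handlers: dict) -> str:
    """Parse a structured tool call and route it to a local handler."""
    call = json.loads(raw_call)
    handler = handlers[call["name"]]
    return handler(**call["arguments"])

result = dispatch(model_output, {"get_weather": lambda city: f"Sunny in {city}"})
print(result)  # Sunny in Paris
```

Because the model's output is constrained to valid JSON, the application can parse it deterministically instead of scraping free-form text, which is what makes autonomous tool use reliable in practice.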
Ministral 3 14B was released under the Apache 2.0 license, granting open-weight access for both commercial and non-commercial applications. The family includes base, instruct, and reasoning variants, with the 14B reasoning version specifically optimized for STEM, coding, and mathematical-logic tasks.