Meta
Open Weights

Llama 4 Maverick

Released Apr 2025

Llama 4 Maverick is a large-scale, natively multimodal language model developed by Meta as part of the Llama 4 family. Released in April 2025, it uses a Mixture-of-Experts (MoE) architecture comprising approximately 400 billion total parameters, of which 17 billion are active per token during inference. The design routes each token to a subset of 128 specialized experts, balancing high-quality performance with improved computational efficiency.
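The efficiency gain comes from the router activating only a few experts per token, so compute per token scales with the number of selected experts rather than the total expert count (hence ~17B active of ~400B total). A minimal sketch of top-k expert routing, with toy dimensions and a simple linear gate that are illustrative assumptions, not Llama 4's actual configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

def moe_forward(x, gate_w, experts, k=1):
    """Route token embedding x to the top-k experts by gate score.

    Only the selected experts are evaluated, so per-token compute
    grows with k, not with the total number of experts.
    """
    scores = x @ gate_w                       # (num_experts,) gate logits
    topk = np.argsort(scores)[-k:]            # indices of chosen experts
    weights = np.exp(scores[topk])
    weights /= weights.sum()                  # softmax over selected experts
    return sum(w * experts[i](x) for w, i in zip(weights, topk))

d, num_experts = 8, 128                       # toy width; 128 experts as in Maverick
gate_w = rng.standard_normal((d, num_experts))
expert_ws = [rng.standard_normal((d, d)) for _ in range(num_experts)]
experts = [lambda x, w=w: x @ w for w in expert_ws]  # toy linear experts

x = rng.standard_normal(d)
y = moe_forward(x, gate_w, experts, k=1)
print(y.shape)  # (8,)
```

With k=1 only a single expert's weights are touched per token, which is why total parameter count and inference cost can diverge so sharply in MoE models.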

The model features native multimodality through an "early fusion" mechanism, allowing it to process text and image inputs simultaneously from the initial layers of the network. Optimized for general-purpose tasks such as complex reasoning, coding, and multilingual interaction, it supports a context window of 1,048,576 tokens (1M), enabling the processing of extensive documents and long-form conversational histories.
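"Early fusion" means image inputs are projected into the same embedding space as text tokens and concatenated into one sequence before the first transformer layer, so every layer attends over both modalities jointly. A minimal sketch of that idea, with toy dimensions and a simple linear patch projection that are assumptions for illustration, not Llama 4's actual vision encoder:

```python
import numpy as np

D = 16  # toy embedding width (illustrative only)

def early_fusion_sequence(text_ids, image_patches, embed_table, patch_proj):
    """Project flattened image patches into the token embedding space
    and concatenate them with text embeddings before the first layer,
    so the full transformer stack sees a single mixed-modality sequence."""
    text_emb = embed_table[text_ids]             # (T, D) text token embeddings
    img_emb = image_patches @ patch_proj         # (P, D) projected patches
    return np.concatenate([img_emb, text_emb])   # (P + T, D) fused sequence

rng = np.random.default_rng(0)
embed_table = rng.standard_normal((1000, D))     # toy vocabulary of 1000 tokens
patch_proj = rng.standard_normal((48, D))        # e.g. flattened 4x4x3 patches
seq = early_fusion_sequence(np.array([1, 2, 3]),
                            rng.standard_normal((5, 48)),
                            embed_table, patch_proj)
print(seq.shape)  # (8, 16)
```

This contrasts with "late fusion" designs that run a separate vision tower and merge modalities only at a later stage; fusing at the input lets all attention layers condition text on image content and vice versa.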

Training for Maverick involved approximately 22 trillion tokens of curated public, licensed, and platform-specific data, with a knowledge cutoff of August 2024. The model was developed through a distillation process from the larger Llama 4 Behemoth foundation model and is distributed under the Llama 4 Community License for research and commercial use.
