Google
Open Weights

Gemma 3 270M

Released Aug 2025

Intelligence: #440
Coding: #377
Math: #258
Context: 32K
Parameters: 270M

Gemma 3 270M is a compact, open-weights large language model (LLM) developed by Google. Released in August 2025 as a hyper-efficient addition to the Gemma 3 family, it is designed specifically for on-device execution and low-latency, task-specific applications. The model is built using the same core research and technology behind the Gemini family, offering a high-quality foundation for developers to fine-tune specialized solutions on local hardware.

The model's architecture consists of 270 million parameters, with roughly 100 million in the transformer blocks and 170 million in the embedding table for its large 256k-token vocabulary. Unlike the larger multimodal models in the Gemma 3 series (such as the 4B and 27B variants), the 270M model is text-only and supports a context window of up to 32,000 tokens. It is optimized for instruction following and structured text generation, demonstrating strong performance on benchmarks like IFEval for its size class.
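The parameter split can be sanity-checked with back-of-the-envelope arithmetic. This sketch assumes an embedding width of 640, which is not stated on this page; the vocabulary size and total parameter count come from the text above.

```python
# Rough check of the ~170M embedding / ~100M transformer split.
VOCAB_SIZE = 256 * 1024   # 256k-token vocabulary (from the text)
HIDDEN_DIM = 640          # assumed embedding width, not stated on this page

embedding_params = VOCAB_SIZE * HIDDEN_DIM          # tokens x width
transformer_params = 270_000_000 - embedding_params # remainder of the 270M total

print(f"embedding:   ~{embedding_params / 1e6:.0f}M parameters")
print(f"transformer: ~{transformer_params / 1e6:.0f}M parameters")
```

With these assumed numbers the embedding table comes to about 168M parameters, leaving about 102M for the transformer blocks, consistent with the 170M/100M split described above.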

Designed for extreme energy efficiency, Gemma 3 270M can operate with a memory footprint as low as 125MB when utilizing INT4 quantization. This efficiency allows the model to run on smartphones, IoT devices, and other resource-constrained environments with minimal battery consumption. It is primarily intended for specialized tasks such as sentiment analysis, entity extraction, and query routing where speed and privacy are prioritized over general-purpose complexity.
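The relationship between quantization level and memory footprint can be sketched with a simple estimate. This helper is illustrative only: it counts weight storage alone and ignores activation memory, the KV cache, and quantization metadata, which is why real-world figures such as the ~125MB cited above can differ somewhat from the raw arithmetic.

```python
def quantized_footprint_mib(num_params: int, bits_per_weight: float) -> float:
    """Approximate weight-storage size in MiB at a given quantization level.

    Ignores activations, KV cache, and per-block quantization metadata.
    """
    return num_params * bits_per_weight / 8 / (1024 ** 2)

params = 270_000_000
print(f"FP16: ~{quantized_footprint_mib(params, 16):.0f} MiB")  # ~515 MiB
print(f"INT8: ~{quantized_footprint_mib(params, 8):.0f} MiB")   # ~257 MiB
print(f"INT4: ~{quantized_footprint_mib(params, 4):.0f} MiB")   # ~129 MiB
```

The INT4 estimate of roughly 129 MiB lines up with the ~125MB figure quoted above, illustrating why 4-bit quantization is what makes smartphone and IoT deployment practical for this model.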

Rankings & Comparison