TII UAE
Open Weights

Falcon-H1R-7B

Released Jan 2026

Intelligence: #271
Coding: #289
Math: #67
Context: 256K
Parameters: 7B

Falcon-H1R-7B is a 7-billion-parameter language model developed by the Technology Innovation Institute (TII) in Abu Dhabi. It is a reasoning-specialized model designed to deliver advanced logic, mathematical, and coding capabilities in a compact format. The model employs a hybrid architecture that integrates Transformer layers with Mamba2 state-space components, improving both processing speed and memory efficiency on long-sequence tasks. Falcon-H1R-7B was trained through a two-stage pipeline: cold-start supervised fine-tuning on long reasoning traces, followed by reinforcement learning with the Group Relative Policy Optimization (GRPO) algorithm. This methodology enables the model to produce detailed chain-of-thought reasoning traces for solving multi-step problems. Additionally, it features a confidence-aware filtering system called DeepConf and supports an extensive context window of 256,000 tokens.
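To illustrate the GRPO stage mentioned above, the sketch below shows the group-relative advantage computation at the heart of the algorithm: several responses are sampled per prompt, each is scored, and rewards are normalized against the group's own mean and standard deviation rather than a learned value function. This is a minimal, hypothetical sketch of the general GRPO technique, not TII's actual training code; the function name and example rewards are illustrative.

```python
from statistics import mean, pstdev

def group_relative_advantages(rewards, eps=1e-8):
    """Normalize each sampled response's reward within its group.

    GRPO replaces a critic network with this group baseline:
    advantage_i = (r_i - mean(group)) / (std(group) + eps).
    """
    mu = mean(rewards)
    sigma = pstdev(rewards)
    return [(r - mu) / (sigma + eps) for r in rewards]

# Example: four responses sampled for one prompt, scored 1.0 if a
# verifier accepts the final answer and 0.0 otherwise (illustrative).
advs = group_relative_advantages([1.0, 0.0, 1.0, 0.0])
# Correct responses get positive advantages, incorrect ones negative.
```

Because the baseline comes from the sampled group itself, the method rewards responses that outperform their siblings on the same prompt, which suits verifiable reasoning tasks where answers can be checked automatically.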

Rankings & Comparison