Upstage AI
Open Weights

Solar Open 100B (Reasoning)

Released Jan 2026

Intelligence: #194
Coding: #282
Context: 128K
Parameters: 102B

Solar Open 100B is a large-scale language model developed by Upstage AI as part of South Korea's "Independent AI Foundation Model" project. Built on a Mixture-of-Experts (MoE) architecture, the model contains approximately 102.6 billion total parameters, of which roughly 12 billion are active per token, spread across 129 experts (128 routed and one shared). It was trained from scratch on 19.7 trillion tokens and uses a custom tokenizer with a 196,608-token vocabulary optimized for Korean, English, and Japanese.
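The gap between total and active parameters is the defining property of this kind of sparse MoE design: each token is processed by only a handful of the routed experts plus the single shared expert, so most weights sit idle on any given forward pass. The PyTorch sketch below illustrates that routing pattern under stated assumptions; the class names, layer sizes, and top-k value are illustrative and scaled down for readability, and only the "128 routed plus one shared expert" structure comes from the description above.

```python
# Minimal sketch of a sparse MoE layer with routed + shared experts.
# Sizes and top-k here are illustrative assumptions, not Solar Open 100B's
# published configuration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class Expert(nn.Module):
    """A small feed-forward expert."""

    def __init__(self, d_model: int, d_ff: int):
        super().__init__()
        self.up = nn.Linear(d_model, d_ff)
        self.down = nn.Linear(d_ff, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.down(F.silu(self.up(x)))


class SparseMoELayer(nn.Module):
    """Top-k routed experts plus one always-on shared expert.

    Solar Open 100B is described as using 128 routed experts and 1 shared
    expert; tiny sizes are used here so the example runs quickly.
    """

    def __init__(self, d_model: int, d_ff: int, n_routed: int, top_k: int):
        super().__init__()
        self.router = nn.Linear(d_model, n_routed)  # gating scores per expert
        self.routed = nn.ModuleList(Expert(d_model, d_ff) for _ in range(n_routed))
        self.shared = Expert(d_model, d_ff)          # runs on every token
        self.top_k = top_k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (n_tokens, d_model)
        scores = self.router(x)                              # (n_tokens, n_routed)
        weights, indices = scores.topk(self.top_k, dim=-1)   # pick k experts per token
        weights = F.softmax(weights, dim=-1)

        routed_out = torch.zeros_like(x)
        for t in range(x.size(0)):                           # naive per-token dispatch
            for w, idx in zip(weights[t], indices[t]):
                routed_out[t] += w * self.routed[int(idx)](x[t])
        return self.shared(x) + routed_out                   # shared expert always added


layer = SparseMoELayer(d_model=64, d_ff=256, n_routed=16, top_k=2)  # toy sizes
tokens = torch.randn(4, 64)
print(layer(tokens).shape)  # torch.Size([4, 64])
```

In a production MoE, the per-token loop is replaced by batched expert dispatch, but the parameter accounting is the same: only the shared expert and the selected routed experts contribute to the roughly 12 billion active parameters per token.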

Capabilities and Reasoning

The model is specifically engineered for advanced reasoning, instruction-following, and agentic tasks. It utilizes Upstage's SnapPO reinforcement learning framework to enhance multi-step reasoning and mathematical problem-solving capabilities. In benchmark evaluations, Solar Open 100B has demonstrated competitive performance against significantly larger frontier models, particularly in linguistic nuance and cultural context for East Asian languages.

Architecture and Training

Solar Open 100B employs a sparse MoE Transformer design to balance depth of knowledge with inference efficiency. Its training process made heavy use of synthetic data generation and domain-specific datasets in fields such as law, medicine, and finance. Following initial industry discussion about architectural similarities to other systems, Upstage published a full technical report and development logs to substantiate that the model was trained independently from scratch.

Rankings & Comparison