01 AI
Open Weights

Yi-1.5-34B-Chat

Released May 2024

Arena AI rank: #210
Parameters: 34B

Yi-1.5-34B-Chat is an upgraded version of the original Yi series large language model developed by 01.AI. It utilizes a decoder-only Transformer architecture and contains 34 billion parameters. This model is specifically fine-tuned for conversational use cases, offering improved instruction-following and dialogue capabilities over its predecessor.
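The defining property of a decoder-only Transformer is causal self-attention: each token can attend only to itself and earlier tokens. A minimal NumPy sketch of that masking is below; the single head and tiny dimensions are illustrative only, not the model's actual 34B configuration.

```python
import numpy as np

def causal_self_attention(x, Wq, Wk, Wv):
    """Single-head causal self-attention: each position may only
    attend to itself and earlier positions (decoder-only masking)."""
    T, d = x.shape
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(d)
    # Upper-triangular mask blocks attention to future tokens
    mask = np.triu(np.ones((T, T), dtype=bool), k=1)
    scores[mask] = -np.inf
    # Row-wise softmax over the unmasked (past and current) positions
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

rng = np.random.default_rng(0)
T, d = 5, 8
x = rng.normal(size=(T, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = causal_self_attention(x, Wq, Wk, Wv)
```

Because of the mask, changing the last token leaves the outputs at all earlier positions unchanged, which is what lets a decoder-only model generate text left to right.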

The Yi-1.5 series was developed by continuously pre-training the base Yi model on an additional 500 billion high-quality tokens. It incorporates advanced alignment techniques, including Supervised Fine-Tuning (SFT) and Direct Preference Optimization (DPO), to enhance its performance in logical reasoning, mathematical problem-solving, and code generation.
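DPO trains directly on preference pairs: it raises the log-probability margin of the preferred response over the rejected one, measured relative to a frozen reference model. A minimal sketch of the per-pair loss follows; the function name and `beta` default are illustrative, not taken from 01.AI's training setup.

```python
import math

def dpo_loss(logp_chosen, logp_rejected,
             ref_logp_chosen, ref_logp_rejected, beta=0.1):
    """Direct Preference Optimization loss for one preference pair.

    Inputs are sequence log-probabilities under the policy being
    trained and under the frozen reference model. Minimizing the loss
    widens the chosen-vs-rejected margin relative to the reference.
    """
    margin = ((logp_chosen - ref_logp_chosen)
              - (logp_rejected - ref_logp_rejected))
    # -log(sigmoid(beta * margin)), written via log1p for clarity
    return math.log1p(math.exp(-beta * margin))
```

With a zero margin the loss is log 2, and it falls as the policy favors the chosen response more strongly than the reference does; `beta` controls how hard the policy is allowed to deviate from the reference.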

Technical Capabilities

Compared to the first generation, Yi-1.5-34B-Chat demonstrates stronger proficiency in coding and complex reasoning while maintaining high performance in language understanding and reading comprehension. The standard chat model supports a base context length of 4K tokens, though the series includes variants for 16K and 32K context windows. It is released under the Apache 2.0 license, which allows for broad use and commercial application.

Rankings & Comparison