PaLM 2 (Pathways Language Model 2) is a transformer-based large language model developed by Google Research, announced in May 2023 as the successor to the original PaLM. It was designed to improve upon its predecessor's reasoning, coding, and multilingual capabilities through the use of compute-optimal scaling, improved architectural design, and a more diverse training data mixture.

The model family comes in four sizes: Gecko, Otter, Bison, and Unicorn. Gecko is the smallest and is optimized for on-device efficiency, while Unicorn is the largest and is designed for complex reasoning tasks. PaLM 2's training data spans more than 100 languages and includes a significant proportion of source code and mathematical expressions, enhancing its performance on logic and programming tasks.

Google also developed specialized versions of the model for specific domains. These include Med-PaLM 2, which is fine-tuned for medical knowledge and question answering, and Sec-PaLM, which is optimized for cybersecurity applications. Unlike its predecessor, PaLM 2 emphasizes efficiency, often outperforming the original 540B-parameter PaLM while using fewer computational resources.
