Sonar Pro is an advanced language model developed by Perplexity, optimized for search-centric information retrieval and high-accuracy response generation. It serves as a high-performance, non-reasoning model within the Sonar family, designed to provide concise and factual answers grounded in real-time internet data.
Technically, the model is built upon the Llama 3.3 70B architecture. Perplexity applies additional post-training to the base model to enhance its factuality and readability, specifically for retrieval-augmented generation (RAG). It is distinguished by its ability to synthesize a higher volume of search results than standard models, typically returning roughly twice as many citations per response to improve verifiability.
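As a sketch of how this search-grounded behavior is typically consumed, the snippet below constructs a request payload for Perplexity's OpenAI-compatible chat-completions endpoint. The model identifier "sonar-pro" and the endpoint URL follow Perplexity's public API; the prompt content and key-handling details are illustrative assumptions, and the network call itself is only indicated in comments.

```python
import json

# Hypothetical request sketch for Sonar Pro via Perplexity's
# OpenAI-compatible chat-completions API. The endpoint and model name
# follow Perplexity's public documentation; prompts are illustrative.
API_URL = "https://api.perplexity.ai/chat/completions"

payload = {
    "model": "sonar-pro",
    "messages": [
        {"role": "system", "content": "Answer concisely and cite sources."},
        {"role": "user", "content": "Summarize recent developments in RAG."},
    ],
}

body = json.dumps(payload)

# An actual call would POST `body` to API_URL with an
# "Authorization: Bearer <PERPLEXITY_API_KEY>" header. Responses carry
# the usual chat-completion fields plus a list of source citations,
# which is where Sonar Pro's higher citation count surfaces.
print(json.loads(body)["model"])
```

Because the endpoint is OpenAI-compatible, the same payload shape also works with generic OpenAI-style client libraries pointed at the Perplexity base URL.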
The model features a 200,000-token context window, enabling it to process extensive source material and maintain coherence in long, multi-turn dialogues. While it does not utilize the explicit chain-of-thought reasoning found in the "Reasoning" variants of the Sonar line, it is frequently employed in agentic search workflows for complex research tasks that require deep content synthesis and multi-step information retrieval.