Artificial Analysis Intelligence Index
Measured May 14, 2026
Score
9.1
DeepSeek-V2-Chat is an open-source chat model built on a Mixture-of-Experts (MoE) architecture with 236B total parameters but only 21B activated per token, achieving high efficiency. It supports a 128K context window and demonstrates strong performance in coding, mathematics, and multilingual tasks.
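The efficiency claim comes from sparse expert routing: only a few experts run per token, so activated parameters (21B) stay far below total parameters (236B). The sketch below illustrates the general top-k MoE routing idea in NumPy; it is a toy assumption-laden example (tiny dimensions, linear "experts" standing in for full FFN blocks), not DeepSeek-V2's actual implementation.

```python
import numpy as np

def moe_forward(x, experts, gate_w, k=2):
    """Sparse MoE layer sketch: route a token to its top-k experts.

    x: (d,) token embedding; experts: list of (d, d) matrices
    (stand-ins for full FFN experts); gate_w: (n_experts, d) router.
    Only k experts run per token -- the reason a model can hold
    236B total parameters while activating only ~21B per token.
    """
    logits = gate_w @ x                       # router score per expert
    topk = np.argsort(logits)[-k:]            # indices of the k best experts
    weights = np.exp(logits[topk])
    weights /= weights.sum()                  # softmax over selected experts only
    return sum(w * (experts[i] @ x) for w, i in zip(weights, topk))

rng = np.random.default_rng(0)
d, n_experts = 8, 4
experts = [rng.normal(size=(d, d)) for _ in range(n_experts)]
gate_w = rng.normal(size=(n_experts, d))
y = moe_forward(rng.normal(size=d), experts, gate_w, k=2)
print(y.shape)  # output has the same dimensionality as the input token
```

Note that compute scales with k (experts actually run), not with the total expert count, which is what decouples parameter count from per-token cost.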