Models

DeepSeek-V2-Chat

DeepSeek-V2-Chat is an open-source chat model built on a Mixture-of-Experts (MoE) architecture with 236B total parameters but only 21B activated per token, achieving high efficiency. It supports a 128K context window and demonstrates strong performance in coding, mathematics, and multilingual tasks.
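The "236B total but only 21B activated per token" efficiency comes from sparse expert routing: a gating layer scores the experts for each token and only the top-k run. The following is a toy sketch of that routing idea, not DeepSeek-V2's actual implementation; all sizes, names, and the gating scheme here are illustrative assumptions.

```python
import numpy as np

# Toy Mixture-of-Experts routing sketch (illustrative only; not the
# DeepSeek-V2 architecture). A gating projection scores every expert for
# a token, and only the top_k highest-scoring experts are evaluated, so
# the active parameter count per token is a small fraction of the total.

rng = np.random.default_rng(0)

d_model, n_experts, top_k = 16, 8, 2

# One tiny feed-forward "expert" per slot.
experts = [rng.normal(size=(d_model, d_model)) for _ in range(n_experts)]
gate_w = rng.normal(size=(d_model, n_experts))  # gating projection

def moe_forward(x):
    """Route a single token vector x through its top-k experts."""
    logits = x @ gate_w                    # score each expert for this token
    top = np.argsort(logits)[-top_k:]      # indices of the k best experts
    weights = np.exp(logits[top])
    weights /= weights.sum()               # softmax over the selected experts
    # Weighted sum of only the chosen experts' outputs; the other
    # n_experts - top_k experts are never evaluated for this token.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

token = rng.normal(size=d_model)
out = moe_forward(token)
print(out.shape)  # (16,)
```

In this toy setup only top_k / n_experts = 2/8 of the expert parameters run per token, which mirrors (in miniature) the 21B-of-236B activation ratio described above.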

Long context · Coding · Reasoning · Cheap
Input / 1M tokens: $0.00
Output / 1M tokens: $0.00
Supported plans: 0

Benchmark history

Evaluations: 1

Artificial Analysis Intelligence Index

Measured May 14, 2026 (Source)

Score: 9.1

Plan availability

Products and plans that support this model: 0

No products or plans have been linked to this model yet.
