Models

DeepSeek-V2.5 (Dec '24)

DeepSeek-V2.5 is a high-performance Mixture-of-Experts (MoE) model optimized for coding and reasoning tasks. It offers a strong balance of capability and cost-efficiency, supporting long context windows for complex applications.

Tags: Coding, Reasoning, Fast, Cheap, Long context

Input / 1M tokens: $0.00
Output / 1M tokens: $0.00
Supported plans: 0
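To illustrate how per-million-token pricing translates into a per-request cost, here is a minimal sketch. The prices and token counts used below are hypothetical placeholders (the listed $0.00 figures do not reflect real rates), not actual DeepSeek-V2.5 pricing:

```python
def request_cost(input_tokens: int, output_tokens: int,
                 input_price_per_m: float, output_price_per_m: float) -> float:
    """Estimate the dollar cost of one request from per-1M-token prices."""
    return (input_tokens / 1_000_000) * input_price_per_m \
         + (output_tokens / 1_000_000) * output_price_per_m

# Hypothetical example: 120k input tokens, 8k output tokens,
# at $0.14 / 1M input and $0.28 / 1M output.
cost = request_cost(120_000, 8_000, 0.14, 0.28)
print(f"${cost:.5f}")
```

Input and output tokens are priced separately, which is why both rates appear in the table above.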

Benchmark history

Evaluations: 2

Math 500
Measured May 14, 2026
Score: 0.76

Artificial Analysis Intelligence Index
Measured May 14, 2026
Score: 12.5

Plan availability

Products and plans that support this model: 0

No products or plans have been linked to this model yet.
