Models

DeepSeek-V2.5

DeepSeek-V2.5 is an advanced Mixture-of-Experts (MoE) large language model, representing an evolution of the V2 series. It is optimized for strong reasoning and coding capabilities while maintaining efficiency. The model supports a long context window.

Tags: Coding · Reasoning · Fast · Cheap · Long context
Input / 1M tokens: $0.00
Output / 1M tokens: $0.00
Supported plans: 0
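The per-1M-token rates above translate into a per-request cost in the usual way: tokens divided by one million, times the rate, summed over input and output. A minimal sketch, assuming those listed rates; `estimate_cost` is a hypothetical helper for illustration, not part of any official SDK.

```python
def estimate_cost(input_tokens: int, output_tokens: int,
                  input_rate: float = 0.00, output_rate: float = 0.00) -> float:
    """Estimate request cost in USD from per-1M-token rates.

    Default rates mirror the listing above ($0.00 / 1M tokens for
    both input and output), so any request is currently free.
    """
    return (input_tokens / 1_000_000) * input_rate \
         + (output_tokens / 1_000_000) * output_rate

# At the listed rates, a request of any size costs nothing.
print(estimate_cost(12_000, 3_500))  # → 0.0
```

Swapping in nonzero rates gives the usual linear cost model used by most token-priced APIs.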

Benchmark history

Evaluations: 2

Artificial Analysis Intelligence Index
Measured May 14, 2026
Score: 12.3

Math 500
Measured May 14, 2026
Score: 0.76

Plan availability

Products and plans that support this model: 0

No products or plans have been linked to this model yet.
