DeepSeek-V2.5
DeepSeek
DeepSeek-V2.5 is a Mixture-of-Experts (MoE) large language model that evolves the V2 series, combining strong general reasoning with coding capability while remaining efficient at inference. The model supports a 128K-token context window.
Input / 1M tokens: $0.00
Output / 1M tokens: $0.00