Models

MiniMax-M2.5

MiniMax-M2.5 is a multimodal large language model developed by MiniMax that processes and generates text, images, and audio. It offers strong reasoning and coding capabilities and is designed for applications that require integrated understanding across data types.

Multimodal · Reasoning · Coding
Input: $0.30 / 1M tokens
Output: $1.20 / 1M tokens
Output speed: 89.31 tokens/s
First-token latency: 1.27 s
Supported plans: 61
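As a quick sanity check on the rates above, a minimal sketch of per-request cost at $0.30 per 1M input tokens and $1.20 per 1M output tokens (the example token counts are hypothetical):

```python
# Listed MiniMax-M2.5 rates in USD per 1M tokens.
INPUT_PER_M = 0.30
OUTPUT_PER_M = 1.20

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of a single request at the listed rates."""
    return (input_tokens * INPUT_PER_M + output_tokens * OUTPUT_PER_M) / 1_000_000

# Example: a 20k-token prompt with a 2k-token completion (hypothetical sizes).
cost = request_cost(20_000, 2_000)
print(f"${cost:.4f}")  # 0.006 + 0.0024 = $0.0084
```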

Benchmark history

Evaluations (9)

All evaluations measured May 14, 2026.

TAU2: 0.95
Terminal-Bench Hard: 0.35
LCR: 0.66
IFBench: 0.72
SciCode: 0.43
HLE: 0.19
GPQA: 0.85
Artificial Analysis Coding Index: 37.4
Artificial Analysis Intelligence Index: 41.9

CODPL speed

Provider ranking (9 providers)

Tencent Cloud, plan tencent_token_plan: 109.65 tokens/s, 2,499 ms to first token, 100% success, 3 samples over 1h. Rank #3.

Bailian (Alibaba Cloud), plan aliyun_bailian: 80.42 tokens/s, 1,682 ms to first token, 100% success, 3 samples over 1h. Rank #6.
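The speed and latency figures above combine into a rough end-to-end estimate: total time ≈ first-token latency + output tokens ÷ throughput. A minimal sketch using the two providers listed (the 1,000-token completion size is hypothetical):

```python
# Rough end-to-end response time: first-token latency plus streaming time.
def total_seconds(first_token_ms: float, tokens_per_s: float, output_tokens: int) -> float:
    return first_token_ms / 1000 + output_tokens / tokens_per_s

# Figures from the provider ranking above, for a 1,000-token completion.
tencent = total_seconds(2_499, 109.65, 1_000)  # ≈ 2.5 s + 9.1 s
bailian = total_seconds(1_682, 80.42, 1_000)   # ≈ 1.7 s + 12.4 s
print(f"Tencent Cloud: {tencent:.1f}s, Bailian: {bailian:.1f}s")
```

Note the trade-off: Bailian starts streaming sooner, but Tencent Cloud's higher throughput finishes a long completion first.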

Plan availability

Products and plans that support this model (20)

Apertis Coding Plan

Apertis Coding Plan is a subscription-based AI coding service providing unified access to 30+ AI models (GPT-5.4, Claude Opus 4.6, Gemini 3.1 Pro, and more) through a single API key. Designed for developers using coding agents like Claude Code, Cursor, Cline, and OpenCode, it offers predictable monthly pricing, free prompt caching, auto-failover, and quota-based billing across OpenAI, Anthropic, Google, and other providers.

OpenCode Go

OpenCode Go is a low-cost monthly subscription that provides reliable access to a curated set of powerful open-source coding models, such as GLM-5.1, Kimi K2.6, and DeepSeek V4 Pro, for use with AI coding agents like OpenCode. Priced at $10/month with a $5 first month, it offers generous request limits to support developers.
