AIME
Measured May 14, 2026
Score: 0
Mistral
A large-scale Mixture-of-Experts (MoE) model composed of 8 experts, each with 22B parameters, enabling efficient inference through sparse activation. It excels at instruction following, dialogue, and complex reasoning tasks while maintaining a relatively low inference cost.
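To make "sparse activation" concrete, below is a minimal sketch of top-k Mixture-of-Experts routing: each token is sent to only a few of the expert feed-forward networks, so compute per token stays far below the model's total parameter count. All names, layer sizes, and the top-2 routing choice are illustrative assumptions for this sketch, not the model's actual configuration.

```python
# Minimal sketch of sparse MoE routing (top-2 gating).
# n_experts=8 matches the description above; d_model, d_ff, and top_k
# are assumed values chosen only to keep the example small and runnable.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseMoE(nn.Module):
    def __init__(self, d_model=512, d_ff=2048, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Router scores every token against every expert.
        self.router = nn.Linear(d_model, n_experts)
        # Each expert is an independent feed-forward block.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):
        # x: (tokens, d_model). Pick the top-k experts per token and
        # combine their outputs weighted by the router's softmax scores.
        logits = self.router(x)
        weights, idx = logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            # Only tokens routed to expert e run through it.
            rows, slots = (idx == e).nonzero(as_tuple=True)
            if rows.numel() == 0:
                continue
            out[rows] += weights[rows, slots].unsqueeze(-1) * expert(x[rows])
        return out


# Usage: per token, only top_k of the 8 expert FFNs are evaluated.
tokens = torch.randn(16, 512)
moe = SparseMoE()
print(moe(tokens).shape)  # torch.Size([16, 512])
```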
Benchmark history
[Score history chart: 0, 0.54, 0.19, 0.15, 0.04, 0.33, 0.54, 9.8]