Mistral

Mixtral 8x7B Instruct

An open-source instruction-tuned model based on a Mixture-of-Experts (MoE) architecture. It excels at complex reasoning tasks, supports long-context processing, and offers fast response times with high cost-efficiency.

Tags: Reasoning · Fast · Cheap · Long context
Input / 1M tokens: $0.45
Output / 1M tokens: $0.70
Supported plans: 0
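
The per-token prices above make request costs easy to estimate. A minimal sketch, using the input and output rates listed on this page; the token counts in the example are hypothetical:

```python
# Estimate the USD cost of one request at the listed rates.
# Prices come from this page; token counts below are illustrative only.
INPUT_PRICE_PER_M = 0.45   # USD per 1M input tokens
OUTPUT_PRICE_PER_M = 0.70  # USD per 1M output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of a single request."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# Example: a 2,000-token prompt producing a 500-token completion.
print(f"${request_cost(2_000, 500):.6f}")  # $0.001250
```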

Benchmark history

Evaluations: 8

| Benchmark | Measured | Score |
| --- | --- | --- |
| AIME | May 14, 2026 | 0 |
| MATH 500 | May 14, 2026 | 0.3 |
| SciCode | May 14, 2026 | 0.03 |
| LiveCodeBench | May 14, 2026 | 0.07 |
| HLE | May 14, 2026 | 0.05 |
| GPQA | May 14, 2026 | 0.29 |
| MMLU-Pro | May 14, 2026 | 0.39 |
| Artificial Analysis Intelligence Index | May 14, 2026 | 7.7 |

Plan availability

Products and plans that support this model: 0

No products or plans have been linked to this model yet.
