TAU2
Measured May 14, 2026
Score: 0.53
NVIDIA
Nemotron Cascade 2 30B A3B is a large language model from NVIDIA's Nemotron family, featuring a Mixture-of-Experts (MoE) architecture with 30 billion total parameters and 3 billion active parameters per token. This design enables efficient, high-speed inference while maintaining strong performance on coding and reasoning tasks.
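The efficiency claim comes from sparse expert routing: each token is sent through only a few experts, so compute scales with the active parameter count rather than the total. A minimal top-k routing sketch is below; this is an illustrative toy, not NVIDIA's implementation, and every name (`moe_forward`, `gate_w`, `experts`) is hypothetical.

```python
import numpy as np

def moe_forward(x, gate_w, experts, k=2):
    """Toy Mixture-of-Experts layer: route token x through its top-k experts.

    x:       (d,) token embedding
    gate_w:  (d, n_experts) router weights
    experts: list of (d, d) expert weight matrices
    """
    logits = x @ gate_w
    topk = np.argsort(logits)[-k:]                    # k highest-scoring experts
    weights = np.exp(logits[topk] - logits[topk].max())
    weights /= weights.sum()                          # softmax over selected experts only
    # Only the k chosen experts run, so compute tracks active (not total) parameters
    return sum(w * (x @ experts[i]) for w, i in zip(weights, topk))

rng = np.random.default_rng(0)
d, n = 4, 8
x = rng.normal(size=d)
gate_w = rng.normal(size=(d, n))
experts = [rng.normal(size=(d, d)) for _ in range(n)]
y = moe_forward(x, gate_w, experts, k=2)
```

Because unselected experts never execute, replacing their weights leaves the output unchanged, which is the property that lets a 30B-parameter model run at roughly 3B-parameter cost per token.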
Benchmark history