TAU2
Score: 0 (measured May 14, 2026)
Allen Institute for AI
OLMo 2 32B is a 32-billion parameter open-source language model from the Allen Institute for AI. It is part of the fully open OLMo 2 series, designed to provide a transparent and reproducible foundation for research and development.
Benchmark history
[Benchmark history chart: score values 0, 0, 0, 0.38, 0.03, 0.08, 0.07, 0.04, 0.33, 0.51, 3.3, 2.7, 10.6; benchmark labels not preserved in the extraction]