AIME: 0.05 (measured May 14, 2026)
Jamba 1.5 Large is a hybrid Transformer-Mamba model designed for exceptional long-context performance, supporting up to 256K tokens. It offers a strong balance of speed, quality, and cost-efficiency, and includes multimodal capabilities for processing both text and images.