TAU2 score: 0.94
Qwen3.5 122B A10B is a large-scale Mixture-of-Experts (MoE) model from Alibaba's Qwen series, optimized for complex reasoning tasks. It pairs 122 billion total parameters with roughly 10 billion active parameters per token, balancing high performance with computational efficiency. The model supports an extremely long context window and excels in code generation and logical analysis.
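The total-vs-active parameter split is characteristic of MoE inference: a router activates only a few experts per token, so per-token compute scales with the active parameters rather than the full 122B. A minimal sketch of top-k expert routing, for illustration only (the expert count and gating details below are generic assumptions, not documented Qwen3.5 internals):

```python
import math

def top_k_route(logits, k=2):
    """Pick the k highest-scoring experts and softmax-normalize their gates."""
    top = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)[:k]
    exps = [math.exp(logits[i]) for i in top]
    total = sum(exps)
    return {i: e / total for i, e in zip(top, exps)}

# Router scores for 4 hypothetical experts; only k=2 run for this token,
# so only their parameters participate in the forward pass.
gates = top_k_route([0.1, 2.0, -1.0, 1.5], k=2)
print(gates)  # experts 1 and 3 selected; gate weights sum to 1.0
```

The same top-k mechanism, scaled up to many experts per layer, is what lets a 122B-parameter model run with an active footprint closer to a 10B dense model.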
Benchmark history
Scores: 0.94, 0.31, 0.67, 0.76, 0.42, 0.23, 0.86, 34.7, 41.6
Plan availability
Apertis Coding Plan is a subscription-based AI coding service providing unified access to 30+ AI models (GPT-5.4, Claude Opus 4.6, Gemini 3.1 Pro, and more) through a single API key. Designed for developers using coding agents like Claude Code, Cursor, Cline, and OpenCode, it offers predictable monthly pricing, free prompt caching, auto-failover, and quota-based billing across OpenAI, Anthropic, Google, and other providers.
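Because the plan exposes every model behind one API key, client code typically only swaps the base URL and model identifier. A minimal sketch of assembling such a request, assuming an OpenAI-compatible chat-completions endpoint (the URL, header layout, and model name below are placeholders, not documented Apertis values):

```python
import json

def build_chat_request(api_key, model, prompt,
                       base_url="https://api.example-apertis.dev/v1"):
    """Assemble the URL, headers, and JSON body for a chat-completion call."""
    url = f"{base_url}/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",  # single key across all models
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,  # e.g. a Qwen, GPT, Claude, or Gemini identifier
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, headers, body

url, headers, body = build_chat_request("sk-...", "qwen3.5-122b-a10b", "hello")
# Send with any HTTP client (urllib.request, requests, httpx, ...).
```

Switching providers under this scheme is a one-string change to `model`, which is the practical benefit of routing 30+ models through a single key.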