Canopy Wave
Canopy Wave is an AI inference platform specializing in open-source model hosting and serverless inference services. The platform provides optimized API access to leading open models including Kimi K2.6, MiMo-V2.5, DeepSeek-V4-Flash, GLM-5.1, and MiniMax-M2.5, with pricing based on input/output tokens and context caching. Beyond inference, Canopy Wave offers GPU cloud infrastructure featuring NVIDIA GB200 NVL72, HGX B200, H200, and H100 clusters for AI model training and deployment. The company positions itself as an enterprise-grade provider with private cloud hosting, full data isolation, zero data retention, and real-time monitoring capabilities.
Region
Singapore
Updated
May 10, 2026
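Since pricing is based on input/output tokens and context caching, the cost of a request can be estimated from its token counts. The sketch below illustrates the arithmetic only; the per-million-token rates are placeholder assumptions, not Canopy Wave's published prices.

```python
def inference_cost(input_tokens: int, output_tokens: int, cached_tokens: int = 0,
                   in_rate: float = 0.60, out_rate: float = 2.40,
                   cache_rate: float = 0.15) -> float:
    """Estimate request cost in USD.

    Rates are USD per million tokens (placeholder values, not actual prices).
    Cached input tokens are billed at the (typically lower) cache rate
    instead of the full input rate.
    """
    billable_input = input_tokens - cached_tokens
    return (billable_input * in_rate
            + cached_tokens * cache_rate
            + output_tokens * out_rate) / 1_000_000

# Example: 1M input tokens (half served from cache) plus 200k output tokens.
print(inference_cost(1_000_000, 200_000, cached_tokens=500_000))
```

Providers with context caching usually bill cached prefix tokens at a steep discount, which is why separating `cached_tokens` out of the input total matters for long, repeated prompts.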
Product coverage
Products from this provider
API gateway
Canopy Wave Unlimited Coding Plan
Canopy Wave Unlimited Coding Plan provides unlimited access to optimized open models such as Kimi K2.6 and MiniMax M2.5 under high-speed token quotas, offering significant cost savings for AI inference workflows.
Plans
4
Models
6
Updated
May 11, 2026
Coding plan
Canopy Wave Coding Plan
Canopy Wave Coding Plan is a subscription service for developers providing access to multiple AI coding models such as GLM-5.1, Kimi K2.6, MiniMax M2.5, and MiMo-V2.5 via an OpenAI-compatible API, optimized for coding workflows and compatible with mainstream AI coding tools.
Plans
2
Models
4
Updated
May 11, 2026
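Because the Coding Plan exposes an OpenAI-compatible API, requests take the standard chat-completions shape. The sketch below builds such a request body with only the standard library; the model identifier is an assumption for illustration, and the (commented-out) endpoint URL is hypothetical, not a documented Canopy Wave address.

```python
import json

def build_chat_request(model: str, prompt: str, max_tokens: int = 256) -> dict:
    """Build a JSON body for an OpenAI-compatible /chat/completions call."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

# Model name is an assumed identifier, not confirmed by the provider docs.
body = build_chat_request("kimi-k2.6", "Write a binary search in Python.")
print(json.dumps(body, indent=2))

# To send it, an OpenAI-compatible client would POST this body with an
# Authorization: Bearer <api-key> header to the provider's chat-completions
# endpoint (URL omitted here since it is not given in this listing).
```

Tools that speak the OpenAI API (editors, agents, SDKs) can typically be pointed at such a provider by overriding the base URL and API key, which is what "compatible with mainstream AI coding tools" amounts to in practice.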
Model coverage
Models from this provider
Discussion