Qwen2.5-Max
Alibaba's large-scale multilingual Mixture-of-Experts model, pretrained on over 20 trillion tokens and targeted at enterprise applications.
Capability scores
- Coding: 7
- Agentic: 7.5
- Reasoning: 7.5
- Math: 7
- Language: 8
Strengths
- Exceptional multilingual capabilities
- Strong enterprise integration (see the API sketch after this list)
- Large context window (131K tokens)
- Efficient MoE architecture
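In practice, "enterprise integration" mostly means calling the model through Alibaba Cloud Model Studio's OpenAI-compatible API. Below is a minimal sketch using the OpenAI Python SDK; the base URL, model identifier, and API-key environment variable are assumptions to verify against Alibaba Cloud's documentation for your account and region.

```python
# Minimal sketch: calling Qwen2.5-Max via an OpenAI-compatible endpoint.
# The base_url, model name, and env var below are assumptions, not
# confirmed values -- check Alibaba Cloud Model Studio docs.
import os

from openai import OpenAI

client = OpenAI(
    # Assumed OpenAI-compatible endpoint exposed by Alibaba Cloud Model Studio.
    base_url="https://dashscope-intl.aliyuncs.com/compatible-mode/v1",
    api_key=os.environ["DASHSCOPE_API_KEY"],  # assumed env var name
)

response = client.chat.completions.create(
    model="qwen-max-2025-01-25",  # assumed identifier for Qwen2.5-Max
    messages=[
        {"role": "system", "content": "You are a concise multilingual assistant."},
        {"role": "user", "content": "Summarize the following clause in French: ..."},
    ],
    temperature=0.3,
)
print(response.choices[0].message.content)
```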
Weaknesses
- Less publicly available benchmarking data than comparable models
- Less extensive multimodal capabilities
- Primarily optimized for enterprise and business contexts rather than general consumer use