Qwen2.5-Max

Large language model series developed by Alibaba Cloud

Qwen2.5-Max is a large-scale language model built on a mixture-of-experts (MoE) architecture. With extensive pre-training and fine-tuning, it delivers strong performance on benchmarks such as Arena-Hard, LiveBench, and GPQA-Diamond, competing with models like DeepSeek V3.
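In an MoE architecture, a lightweight router sends each input to only a few "expert" sub-networks, so most parameters stay idle per token. The sketch below is a generic top-k MoE layer with toy linear experts for illustration only; it is not Qwen2.5-Max's actual implementation, and all names and shapes here are made up:

```python
import numpy as np

def moe_forward(x, gate_w, expert_ws, k=2):
    """Toy top-k mixture-of-experts layer (illustrative, not Qwen's design).

    x: (d,) input vector
    gate_w: (d, n_experts) router weights
    expert_ws: list of (d, d) weight matrices, one per toy linear expert
    """
    logits = x @ gate_w                       # router score for each expert
    top = np.argsort(logits)[-k:]             # indices of the k best experts
    probs = np.exp(logits[top] - logits[top].max())
    probs /= probs.sum()                      # softmax over the selected experts
    # Only the k chosen experts run; their outputs are gate-weighted and summed
    return sum(p * (x @ expert_ws[i]) for p, i in zip(probs, top))

rng = np.random.default_rng(0)
d, n_experts = 8, 4
x = rng.standard_normal(d)
gate_w = rng.standard_normal((d, n_experts))
expert_ws = [rng.standard_normal((d, d)) for _ in range(n_experts)]
y = moe_forward(x, gate_w, expert_ws)
print(y.shape)  # (8,)
```

The point of the routing step is efficiency: with k=2 of 4 experts active, only half the expert parameters are touched per input, which is how MoE models scale total parameter count without scaling per-token compute proportionally.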


Chris Messina
That Alibaba launched Qwen 2.5-Max on the first day of the Lunar New Year signals an urgent response to DeepSeek's recent AI breakthroughs. This large-scale Mixture-of-Experts (MoE) model has been pre-trained on over 20 trillion tokens (!!) and enhanced through Supervised Fine-Tuning (SFT) and Reinforcement Learning from Human Feedback (RLHF).
André J
How does DeepSeek compare against Qwen regarding price?