Qwen-Image is a new 20B open-source image foundation model by the Qwen team. It excels at complex text rendering (especially Chinese) and precise image editing, while also delivering strong general image generation. Available now in Qwen Chat.
Qwen3-235B-A22B-Thinking-2507 is a powerful open-source MoE model (22B active) built for deep reasoning. It achieves SOTA results on agentic tasks, supports a 256K context, and is available on Hugging Face and via API.
Qwen3-Coder is a new 480B MoE open model (35B active) by the Qwen team, built for agentic coding. It achieves SOTA results on benchmarks like SWE-bench, supports up to 1M context, and comes with an open-source CLI tool, Qwen Code.
Qwen3 is the newest family of open-weight LLMs from Alibaba, ranging from 0.6B dense models to a 235B MoE. It features a switchable "Thinking Mode" that trades response speed for deeper reasoning, delivers strong performance on code and math, and offers broad multilingual support.