Qwen 1.5 MoE

Highly efficient mixture-of-experts (MoE) model from Alibaba

5.0 • 1 review • 51 followers
Qwen1.5-MoE-A2.7B is a small mixture-of-experts (MoE) model with only 2.7 billion activated parameters, yet it matches the performance of state-of-the-art 7B models such as Mistral 7B and Qwen1.5-7B.
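The "activated parameters" figure comes from how MoE layers work: a gating network routes each token to a small subset of experts, so only those experts' weights run for that token while the rest stay idle. Below is a minimal illustrative sketch of top-k gating with toy shapes and random weights; it is not Qwen's actual routing implementation (the function name, dimensions, and expert count here are hypothetical).

```python
import numpy as np

def moe_forward(x, gate_w, experts, k=4):
    """Route a token vector x to its top-k experts.

    Only the k selected experts' parameters are "activated" for this
    token, which is why an MoE model's activated parameter count is far
    below its total parameter count.
    """
    logits = x @ gate_w                       # gating scores, one per expert
    topk = np.argsort(logits)[-k:]            # indices of the k best experts
    weights = np.exp(logits[topk])
    weights /= weights.sum()                  # softmax over selected experts only
    # Weighted sum of the chosen experts' outputs; unselected experts do no work.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, topk))

# Toy demo: 16 experts, hidden size 8, 4 experts activated per token.
rng = np.random.default_rng(0)
d, num_experts = 8, 16
x = rng.normal(size=d)
gate_w = rng.normal(size=(d, num_experts))
experts = rng.normal(size=(num_experts, d, d))
y = moe_forward(x, gate_w, experts, k=4)
print(y.shape)  # (8,)
```

With 16 experts but only 4 active per token, roughly a quarter of the expert parameters participate in any single forward pass; the same principle, at much larger scale, is how Qwen1.5-MoE-A2.7B activates 2.7B parameters out of a larger total.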
Company Info
huggingface.co/Qwen

Qwen 1.5 MoE Info
Launched in 2024

Forum
p/qwen-1-5-moe
© 2025 Product Hunt
Reviews
Salman Paracha, used Qwen 1.5 MoE to build Arch (262 points):

"Highly performant base models that can be used for task-specific training, such as the function-calling experience built into Arch."

10mo ago