LM Studio

Discover, download, and run local LLMs

5.0 • 1 review • 193 followers

LLMs
🤖 Run LLMs on your laptop, entirely offline
📚 Chat with your local documents
👾 Use models through the in-app Chat UI or an OpenAI-compatible local server
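
The last feature above, the OpenAI-compatible local server, is the piece most people script against. Below is a minimal sketch of calling it from Python, assuming the server has been enabled in the app, is listening on its default http://localhost:1234/v1, and already has a model loaded; the port and the placeholder model name are assumptions rather than details taken from this page.

```python
# Minimal sketch: chat with a locally loaded model through LM Studio's
# OpenAI-compatible server. Assumptions: the server is enabled in the app,
# listens on the default port 1234, and a model is already loaded.
import requests

resp = requests.post(
    "http://localhost:1234/v1/chat/completions",  # assumed default address
    json={
        "model": "local-model",  # placeholder; the server answers with whatever model is loaded
        "messages": [
            {"role": "user", "content": "Say hello from an offline LLM."}
        ],
        "temperature": 0.7,
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```
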
Company Info
lmstudio.ai
Launched in 2025

Similar Products

Jan
On-device ChatGPT alternative that runs 100% offline
4.0 (3 reviews)
LLMs • Compliance software

Chatbox
Better UI & Desktop App for ChatGPT, Claude and other LLMs.
4.7 (43 reviews)
AI Generative Art • AI Chatbots
Free

Launch tags: Mac • Developer Tools • Artificial Intelligence

Launch Team / Built With
Chris Messina, Yagil Burowski
Hugging Face, Mistral AI, Llama


Chris Messina (Hunter) 📌
Want to get on the DeepSeek hype train but don't want your data to be sent to China? Cool! You can run DeepSeek R1 models locally with LM Studio if you have enough RAM. Here's how to do it:
  1. Download LM Studio for your operating system from lmstudio.ai.
  2. Click the 🔎 icon on the sidebar and search for "DeepSeek".
  3. Pick an option that will fit on your system. For example, if you have 16GB of RAM, you can run the 7B or 8B parameter distilled models. If you have ~192GB+ of RAM, you can run the full 671B parameter model.
  4. Load the model in the chat and start asking questions! (A scripted alternative through the local server is sketched after this comment.)
Of course, you can also run other models locally using LM Studio, like @Llama 3.2, @Mistral AI, Phi, Gemma, @DeepSeek AI, and Qwen 2.5.
6mo ago
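
To make step 4 above concrete for anyone who prefers scripting to the chat UI, here is a sketch of the same interaction through LM Studio's OpenAI-compatible local server, this time using the official openai Python client pointed at localhost. The port, API key, and model identifier are illustrative assumptions; use the exact model name LM Studio shows for the distill you downloaded in step 3.

```python
# Sketch of step 4 done programmatically rather than in the chat UI.
# Assumptions: LM Studio's OpenAI-compatible server is running on the
# default port 1234 and a DeepSeek R1 distill has been downloaded.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",  # assumed default server address
    api_key="lm-studio",                  # any non-empty string; not checked locally
)

reply = client.chat.completions.create(
    model="deepseek-r1-distill-qwen-7b",  # illustrative name; copy the exact id LM Studio shows
    messages=[{"role": "user", "content": "What does running entirely offline mean for my data?"}],
)
print(reply.choices[0].message.content)
```
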
Yuvraj Sonawane
@chrismessina I have been using it for some time and love its interface. Also, thank you for pointing out that 192GB or more is needed for the full R1.
6mo ago
Tasos V (chatWise)
Nice work folks! Just curious, what is the main difference between using LM Studio and running things locally using Ollama, for example? Is it the offline part? Or am I missing something else?
6mo ago
Chris Messina (Hunter)
@cryptosymposium a big difference is having a UI to interact with the models, vs. the terminal.
6mo ago
Tasos V (chatWise)
@chrismessina thanks for the reply Chris! Ok yeah, that makes sense then, a GUI is always more intuitive. Will definitely give it a go!
6mo ago
Max Comperatore
Is this like a web UI?
6mo ago
Chris Messina (Hunter)
@maxcomx which web UI?
6mo ago


Forum Threads

LM Studio (p/lm-studio-2) • Chris Messina • 5d ago

Run OpenAI's gpt-oss locally in LM Studio

Want to run @OpenAI's open models locally? Now you can with LM Studio!

Reviews
5.0 • Based on 1 review
Sarang N • 5 reviews
Absolutely beautiful user interface. It's super easy to set up and start using. I've used Ollama, but since its UI is a separate project, it's a bit difficult to set up. LM Studio also has a very good collection of models available compared to Ollama. My favourite thing about LMS is that it shows the model size upfront, so I don't have to dig through to find it. The CUDA runtime is also a great plus.
5mo ago