Ollama shoutouts
The easiest way to run large language models locally
17 reviews • 11 shoutouts • 386 followers
Maker Shoutouts
Testimonials from top launches
Anthony Lagrede used this to build Znote (211 points) • 4d ago
"The best choice for interacting with AI for free while keeping your data local and secure."
Boris Arzentar used this to build cognee (358 points) • 1mo ago
"Ollama is the choice for our users who want local graphs; it works best with 32B-parameter models."
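For context on the 32B-parameter note above: Ollama exposes a local HTTP API (by default at http://localhost:11434), so a pipeline like cognee's can query whatever model tag has been pulled. A minimal sketch in Python, assuming the requests library is installed and that an illustrative 32B tag such as qwen2.5:32b has already been pulled with the Ollama CLI:

    import requests

    # Ollama's default local endpoint; the model tag below is illustrative and
    # must already be pulled (e.g. via the Ollama CLI) before this call works.
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "qwen2.5:32b",   # assumed 32B-parameter tag
            "prompt": "Summarize: Ollama runs LLMs locally.",
            "stream": False,          # return one JSON object instead of a stream
        },
        timeout=300,
    )
    resp.raise_for_status()
    print(resp.json()["response"])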
Ozgur Ozer used this to build AI Renamer (134 points) • 2mo ago
"It's one of the best apps to run models locally."
Aaron Ng used this to build Apollo AI (237 points) • 4mo ago
"A great way to try out LLMs on your desktop. Also great for hosting your own AI servers for apps like Apollo."
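On the point about hosting your own AI server for client apps: the Ollama server listens on localhost:11434 by default and can be exposed on a network address so other devices reach it. A hedged sketch of a client that reads the server address from an environment variable; the variable name OLLAMA_SERVER and the model tag are illustrative, not Apollo's actual integration:

    import os
    import requests

    # Address of a self-hosted Ollama server; OLLAMA_SERVER is an illustrative
    # variable name for this sketch, defaulting to the local instance.
    base_url = os.environ.get("OLLAMA_SERVER", "http://localhost:11434")

    resp = requests.post(
        f"{base_url}/api/chat",
        json={
            "model": "llama3.1",  # any model already pulled on the server
            "messages": [{"role": "user", "content": "Hello from a client app!"}],
            "stream": False,
        },
        timeout=300,
    )
    resp.raise_for_status()
    print(resp.json()["message"]["content"])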
Parth Sharma used this to build Lagrange by OrangeCat (143 points) • 4mo ago
"Because it's the one-stop solution for local inference. At OrangeCat, we aim to give users the highest level of customizability, and there's no better way to do that than to use local LLM inference."
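The customizability mentioned above can be illustrated with the per-request options Ollama accepts, such as sampling temperature and context length. A sketch, again assuming a locally pulled model; the tag and values are placeholders:

    import requests

    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "llama3.1",  # placeholder model tag
            "prompt": "Explain local LLM inference in one sentence.",
            "stream": False,
            "options": {              # per-request tuning knobs
                "temperature": 0.2,   # lower = more deterministic output
                "num_ctx": 4096,      # context window size in tokens
            },
        },
        timeout=300,
    )
    resp.raise_for_status()
    print(resp.json()["response"])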
Giancarlo Erra used this to build Infinite Convo (174 points) • 4mo ago
"The best and easiest system for experimenting with different models."
Martin used this to build Focu (112 points) • 6mo ago
"By embedding Ollama directly in the app, my users can communicate with local AI models with ease, without having to rely on a third-party API."
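As a sketch of what embedding Ollama can look like from the application side: the app talks to the locally running Ollama server over HTTP and streams tokens back as they are generated, so no third-party API is involved. This assumes the requests library and a locally pulled model, and is not Focu's actual integration:

    import json
    import requests

    # Stream a chat response from a local Ollama server; each line of the
    # response body is a JSON object carrying a chunk of the assistant message.
    with requests.post(
        "http://localhost:11434/api/chat",
        json={
            "model": "llama3.1",  # placeholder model tag
            "messages": [{"role": "user", "content": "Give me a short focus tip."}],
            "stream": True,
        },
        stream=True,
        timeout=300,
    ) as resp:
        resp.raise_for_status()
        for line in resp.iter_lines():
            if not line:
                continue
            chunk = json.loads(line)
            print(chunk.get("message", {}).get("content", ""), end="", flush=True)
            if chunk.get("done"):
                break
        print()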
Pham Binh used this to build Off-grid LLM over Radio (95 points) • 5mo ago
"It's super easy to use and fast to develop for."
Tim Carambat used this to build AnythingLLM (148 points) • 9mo ago
"For running and serving LLMs that people can run locally with no frustration, there are few as high quality as Ollama."
Ozgur Ozer used this to build AI Renamer (108 points) • 10mo ago
"It's so easy to build local AI apps with Ollama's APIs."
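To give a feel for the API surface being praised here: besides generation and chat, the local server exposes introspection endpoints, such as listing the models already installed. A minimal sketch, assuming requests and a running local Ollama instance:

    import requests

    # Ask the local Ollama server which models are installed.
    resp = requests.get("http://localhost:11434/api/tags", timeout=30)
    resp.raise_for_status()

    for model in resp.json().get("models", []):
        size_gb = model.get("size", 0) / 1e9
        print(f"{model['name']}: {size_gb:.1f} GB")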
Antonio Sanchez used this to build Prompto (110 points) • 1yr ago
"Ollama helped me test the product without having to spend money on token credits with other LLMs."