Ollama

The easiest way to run large language models locally

17 reviews · 11 shoutouts · 386 followers
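The tagline refers to Ollama's local-first workflow: download a model with the Ollama CLI, then query it from your own code over the local HTTP API. A minimal sketch in Python, assuming the server is listening on its default localhost:11434 port and a model has already been pulled (the "llama3" tag and the prompt are purely illustrative):

import requests

# Ask a locally pulled model for a completion via Ollama's generate endpoint.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",   # any model tag already downloaded locally
        "prompt": "Why would someone run an LLM locally?",
        "stream": False,     # return one JSON object instead of a stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])  # the generated text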
Product status: Unclaimed
Forum Threads (p/ollama)
Chris Messina: Ollama - The easiest way to run large language models locally · 30 · 1yr ago
Links: github.com/jmorganca/ollama
Makers: Jeff Morgan

Maker Shoutouts

Testimonials from top launches

Anthony Lagrede, used this to build Znote (211 points) · 4d ago
"The best choice for interacting with AI for free while keeping your data local and secure."
Boris Arzentar, used this to build cognee (358 points) · 1mo ago
"Ollama is the choice for our users wanting local graphs; it works best with 32B-parameter models."
Ozgur Ozer, used this to build AI Renamer (134 points) · 2mo ago
"It's one of the best apps to run models locally."
Aaron Ng, used this to build Apollo AI (237 points) · 4mo ago
"A great way to try out LLMs on your desktop. Also great for hosting your own AI servers for apps like Apollo."
Parth Sharma, used this to build Lagrange by OrangeCat (143 points) · 4mo ago
"Because it's the one-stop solution for local inference. At OrangeCat, we aim to give users the highest level of customizability, and there's no better way to do that than to use local LLM inference."
Giancarlo Erra, used this to build Infinite Convo (174 points) · 4mo ago
"The best and easiest system to experiment with different models."
Martin, used this to build Focu (112 points) · 6mo ago
"By embedding Ollama directly in the app, my users can communicate with local AI models with ease, without having to rely on a third-party API."
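As a rough illustration of the pattern Martin describes (an app talking to a local Ollama server instead of a hosted third-party API), a sketch against Ollama's chat endpoint; the model tag and prompt are placeholders, not taken from Focu:

import requests

def ask_local_model(user_text: str, model: str = "llama3") -> str:
    # Send a single chat turn to the local Ollama server (default port 11434).
    resp = requests.post(
        "http://localhost:11434/api/chat",
        json={
            "model": model,
            "messages": [{"role": "user", "content": user_text}],
            "stream": False,
        },
        timeout=120,
    )
    resp.raise_for_status()
    # Non-streaming responses carry the assistant reply under "message".
    return resp.json()["message"]["content"]

print(ask_local_model("Give me one tip for staying focused this afternoon."))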
Pham Binh, used this to build Radio LLM (Off-grid LLM over Radio, 95 points) · 5mo ago
"It's super easy to use and fast to develop for."
Tim Carambat, used this to build AnythingLLM (148 points) · 9mo ago
"For running and serving LLMs that people can run locally with no frustration, there are few as high quality as Ollama."
Ozgur Ozer, used this to build AI Renamer (108 points) · 10mo ago
"It's so easy to build local AI apps with Ollama's APIs."
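In the same spirit, a small sketch using the official ollama Python package (pip install ollama); the model tag and the file-renaming prompt are made up for illustration and are not taken from AI Renamer itself:

import ollama

def suggest_filename(description: str, model: str = "llama3") -> str:
    # Generate a short name suggestion with a locally running model.
    result = ollama.generate(
        model=model,
        prompt=f"Suggest a short, kebab-case filename for: {description}",
    )
    # Recent versions of the package return a response object that also
    # supports dict-style access; "response" holds the generated text.
    return result["response"].strip()

print(suggest_filename("screenshot of a quarterly sales dashboard"))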
Antonio Sanchez, used this to build Prompto (110 points) · 1yr ago
"Ollama helped me test the product without having to spend money on token credits with other LLMs."