I'm working on a tool to launch AIs locally as easily as launching Spotify - I'd love your feedback!
Hello everyone!
I recently posted in this forum about creating the easiest tool to launch AI models locally, asking whether you'd be interested.
I've had some feedback from some of y'all, and so as not to head in the wrong direction, I've come to ask your opinion once again.
The existing tools (LM Studio, Ollama, etc.) are powerful... but not necessarily easy for everyone to use (choosing a model, complicated interfaces, lots of options, etc.).
I'd like to make it as simple as an iPhone app.
I'm trying to understand:
- Is this a real problem for you?
- What's stopping you from using these tools today?
- What would your ideal solution look like?
I look forward to hearing from you in the comments, and if you'd like to become a beta tester later on, I'll be waiting for you in the DMs!
Replies
The real problems are data accuracy, hallucinations, having to train the bot in the chat to grasp the theme, and the citations (they mostly tend to be 404s and need extra time to be checked manually).
I hope this helps?
@tania_j Hello Tania, thank you for your comment. Do you mean the problems you run into when using online models, like ChatGPT, Claude, etc.?
@enzo_ghillain Yep. I've used plenty, and these are the common issues I experienced.
What gap, not already filled by the ChatGPT/Grok/Gemini/... apps, are you thinking of filling?
Launching Ollama or LM Studio is not a problem.
My issue is with integrating memory for a local LLM. Maybe instead of building another program like Ollama or LM Studio, you could dedicate time to creating a program that serves as memory for an LLM?
You know… instead of another LLM program, make a program that functions as memory.
So you install the memory program, connect it to Ollama or LM Studio, and your LLM has memory (see the sketch below).
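To make the idea concrete, here's a minimal sketch in Python, assuming Ollama is running locally and exposes its documented /api/chat endpoint; the memory.json file name and the helper functions are hypothetical, just to illustrate the shape of such a tool. Every conversation turn is persisted to disk and the full history is replayed on each request, so the model "remembers" across sessions:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's local chat endpoint
MEMORY_FILE = "memory.json"                     # hypothetical flat-file memory store

def load_memory():
    """Load previously saved conversation turns, if any exist."""
    try:
        with open(MEMORY_FILE) as f:
            return json.load(f)
    except FileNotFoundError:
        return []

def save_memory(messages):
    """Persist the full message history back to disk."""
    with open(MEMORY_FILE, "w") as f:
        json.dump(messages, f)

def chat(user_input, model="llama3"):
    """Send stored history plus the new turn to Ollama, then save the reply."""
    messages = load_memory()
    messages.append({"role": "user", "content": user_input})
    payload = json.dumps(
        {"model": model, "messages": messages, "stream": False}
    ).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.load(resp)["message"]  # assistant message from Ollama
    messages.append(reply)
    save_memory(messages)
    return reply["content"]

print(chat("What did we talk about last time?"))
```

Replaying the entire history is the simplest possible form of memory; a real tool would summarize old turns or retrieve only the relevant ones, to stay within the model's context window.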