
Run Llama 2 and other models on macOS, with Windows and Linux coming soon. Customize and create your own.
Ollama is the best way to run LLMs locally and the easiest way to test, build, and deploy new models. It's a fantastic product that has opened my eyes to the world of LLMs.
A very easy and powerful way to run and customize local LLMs, and to integrate them with tools like LangChain or LlamaIndex.
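As a concrete taste of that integration story: Ollama exposes a local REST API (by default on `http://localhost:11434`), which is what libraries like LangChain and LlamaIndex talk to under the hood. A minimal sketch, assuming a local Ollama server and using `llama2` as an illustrative model name:

```python
import json

# Build a request body for Ollama's /api/generate endpoint.
# The model name and prompt here are illustrative placeholders.
payload = {
    "model": "llama2",
    "prompt": "Why is the sky blue?",
    "stream": False,  # request one JSON object instead of a token stream
}
body = json.dumps(payload)
print(body)

# With a local Ollama server running, you could send it with the stdlib:
#   import urllib.request
#   req = urllib.request.Request(
#       "http://localhost:11434/api/generate",
#       data=body.encode(),
#       headers={"Content-Type": "application/json"},
#   )
#   print(urllib.request.urlopen(req).read().decode())
```

Higher-level wrappers in LangChain or LlamaIndex build essentially this request for you, so any model you pull or customize locally is immediately usable from those frameworks.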