marcusmartins left a comment
Recently I was on a long flight, and having ollama (with llama2) running locally really helped me prototype some quick changes to our product without having to rely on spotty plane wifi.
Ollama: The easiest way to run large language models locally