Roche

Would you prefer local LLM software?

There are many online LLM UIs, like Monica and MaxAI. Would you prefer a local LLM for chat-with-PDF or chatting with your own content? If so, which model or capability would you want?


Replies

Ruben Boonzaaijer
I use Ollama to host my local models, like Mistral and Llama 2.
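
For anyone who wants to try the same setup, a minimal sketch of the Ollama workflow looks like this (model names here are examples from the Ollama model library; the prompt text is illustrative):

```shell
# Pull models from the Ollama library to run them locally.
ollama pull mistral
ollama pull llama2

# One-off chat directly in the terminal.
ollama run mistral "Explain what a local LLM is in one sentence."

# Ollama also serves a local HTTP API (default port 11434),
# which chat-with-PDF style front ends can point at.
curl http://localhost:11434/api/generate \
  -d '{"model": "mistral", "prompt": "Hello", "stream": false}'
```

Everything runs on your own machine, so nothing leaves your computer, which is the main draw over the online UIs mentioned in the question.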