Which models are actually recommended, and how usable is the browser with them? "We have Ollama integration" isn't very helpful when there's no information about which models you should use, what works with them, and what doesn't. Honestly, it feels disingenuous when projects market themselves as 100% private, local, and cloud-free, with everything staying on your computer, while the intended use case is clearly to paste in an OpenAI API key and send everything to OpenAI.
Thank you! We already have Ollama integration; you can run models locally and use them for AI chat.
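To make that concrete, here's a rough sketch of what the local path boils down to (this isn't the browser's actual integration code, just an illustration). It assumes Ollama is running on its default port 11434 and a model has been pulled, e.g. `ollama pull llama3.1`; the `localChat` helper and the model name are placeholders. Nothing in this flow leaves your machine.

```typescript
// Minimal sketch: send a chat request to a locally running Ollama server.
// Assumes `ollama serve` is running and a model (e.g. llama3.1) is pulled.
// The request goes to localhost only; no cloud API key is involved.

type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

async function localChat(messages: ChatMessage[], model = "llama3.1"): Promise<string> {
  // Ollama's native chat endpoint listens on port 11434 by default.
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, messages, stream: false }),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = await res.json();
  // With stream: false, the reply comes back under message.content.
  return data.message.content;
}

// Example usage:
// localChat([{ role: "user", content: "Summarize this page in two sentences." }])
//   .then(console.log);
```

Ollama also exposes an OpenAI-compatible endpoint at `http://localhost:11434/v1`, so tools built against the OpenAI API can often be pointed at it by swapping the base URL.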