Jeffry Samuel
Hi, I found out what the issue was: the titles didn't have wrapping enabled, and there's a model with a really long name, which is why the dialog wasn't resizing correctly...
That problem has to do with AMD GPUs and how they work with Ollama; I'm still trying to figure it out. About the last paragraph, you don't have to use...
Hi, thanks for the suggestion. Unfortunately I can't remove the focus from the messages because of accessibility (screen readers). I think I should focus the message entry textbox instead; that should...
Should I close the issue, then?
Hi, you can connect to your remote instance by going to Preferences. The setting is saved, so the integrated instance doesn't start while the app is connected to another one.
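A minimal sketch of how a saved remote-URL preference could gate the integrated instance. The preference key and function name here are hypothetical, for illustration only, not Alpaca's actual code:

```python
# Sketch: decide whether to launch the bundled Ollama instance.
# The "remote_url" preference key is an assumption for this example.

def should_start_integrated_instance(prefs: dict) -> bool:
    """Skip the integrated instance when a remote URL is saved in preferences."""
    remote_url = prefs.get("remote_url", "").strip()
    # No remote configured -> start the integrated instance as usual.
    return remote_url == ""
```

With this kind of check, startup only spawns the local server when no remote endpoint has been saved.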
I won't make the instance opt-in; the idea of the app is to be an all-in-one solution to run and use AI models. If you turn on...
I am a developer; I know the advantages of having a managed Docker container or a separate Ollama instance, and I get why some people prefer to do that. But...
I added this check for the next version: if you choose to manually remove the Ollama binary, the app will try to connect to the URL you set in the...
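A rough sketch of that kind of reachability check, assuming Ollama's standard HTTP server answers a plain GET at its base URL; the function name and timeout are illustrative, not Alpaca's actual code:

```python
# Sketch: probe whether an Ollama server is reachable at a given URL.
import urllib.error
import urllib.request


def ollama_reachable(url: str, timeout: float = 2.0) -> bool:
    """Return True if an HTTP server answers at `url` with status 200."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError, ValueError):
        # Connection refused, timeout, or malformed URL: treat as unreachable.
        return False
```

The app could fall back to this probe when the local binary is missing, and only report an error if the configured URL doesn't answer either.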
> Currently the models already pulled by Ollama do not show up in the model manager. It would be cool to have auto-detect and import.
>
> `~/.ollama/models/manifests/registry.ollama.ai/library`...
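Given the manifest layout quoted above (under `library`, one directory per model with one file per tag), detection could be sketched roughly like this; the function name is hypothetical:

```python
# Sketch: list locally pulled models from Ollama's manifest directory layout,
# assuming each model is a subdirectory and each tag is a file inside it.
from pathlib import Path


def detect_pulled_models(library_dir: str) -> list[str]:
    """Return pulled models as 'name:tag' strings, sorted."""
    root = Path(library_dir)
    if not root.is_dir():
        return []  # Ollama not installed or nothing pulled yet.
    models = []
    for model_dir in sorted(p for p in root.iterdir() if p.is_dir()):
        for tag_file in sorted(p for p in model_dir.iterdir() if p.is_file()):
            models.append(f"{model_dir.name}:{tag_file.name}")
    return models
```

A model manager could run this once at startup and offer to import anything it finds that isn't already tracked.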
I believe this can be closed; now, if Alpaca detects an Ollama binary (Flatpak plugin), it will auto-create an instance if there isn't one already.