oatmealm

106 comments by oatmealm

I'm still running into problems with gfx900 support. When installing ollama using the script, I get this:

```
➜ ollama run llama3:instruct
Error: llama runner process has terminated: signal: aborted...
```
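In case it helps anyone hitting the same abort, a few hedged diagnostics I'd try first (the `ollama` service name and the env var assume a script-based Linux install, they're not confirmed in this thread):

```
# confirm ROCm actually reports the card as gfx900
rocminfo | grep -i gfx

# inspect the ollama service log around the llama runner abort
journalctl -u ollama -e

# or run the server in the foreground with verbose logging
OLLAMA_DEBUG=1 ollama serve
```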

Solved: I'm not sure whether I installed the rocblas package manually, but I think it's installed as part of, and under the same path as, the rest of...
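A minimal sketch of the check I mean, assuming a standard /opt/rocm layout (paths will differ per distro and ROCm version):

```
# rocblas itself, plus its Tensile kernel files for gfx900
ls /opt/rocm/lib/librocblas.so*
ls /opt/rocm/lib/rocblas/library | grep -i gfx900
```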

Not working for me either. I have lobe-chat in a container and ollama on bare-metal macOS. Ollama is accessible from open-webui, every Chrome extension I've tried, Emacs, the command line...
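For reference, this is roughly the setup I'd expect a containerized client to need to reach Ollama on the macOS host; the env vars and image here are my assumptions, not something confirmed in this thread:

```
# on the macOS host: let Ollama accept non-localhost connections and
# browser-origin (CORS) requests, then restart the Ollama app
launchctl setenv OLLAMA_HOST "0.0.0.0"
launchctl setenv OLLAMA_ORIGINS "*"

# in the container: point lobe-chat at the Docker host alias instead of
# localhost (which resolves to the container itself)
docker run -d -p 3210:3210 \
  -e OLLAMA_PROXY_URL=http://host.docker.internal:11434 \
  lobehub/lobe-chat
```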

Got it. I was also wondering: where is the text (of the comments) stored? Is it hashed with the highlights metadata?

Do overlays support this scenario with variable-width fonts? Most often it seems to work just fine.

I found myself in the same situation. I moved an existing doom install to make room for chemacs and was getting the same error. First I tried `sync` followed by `build`,...
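For anyone untangling the same move, a hedged sketch of what I mean (the ~/doom-emacs path and the EMACSDIR variable are assumptions about this setup, not a confirmed fix):

```
# point Doom's CLI at the relocated install before syncing/building
export EMACSDIR=~/doom-emacs
~/doom-emacs/bin/doom sync
~/doom-emacs/bin/doom build
```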

+1 would be great

I see it's not happening anymore... anyways, thanks for the link to the forum. I'll read more about it.

I'm jumping in to ask about using a remote installation of ollama. When trying something like `http://10.0.0.10:11434` as the url_base, it can't find the server. The server is accessible remotely...
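To narrow it down, a couple of hedged checks (the address is just the example above; the env var is my assumption about the server side):

```
# on the remote box, make Ollama bind to all interfaces, not just 127.0.0.1
OLLAMA_HOST=0.0.0.0:11434 ollama serve

# from the client machine, confirm the API answers before blaming url_base
curl http://10.0.0.10:11434/api/tags
```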

Just to say that the project is really impressive (I've tried it with a local server as well). Thanks!