Kainoa Kanter

651 comments of Kainoa Kanter

I'll try again right now. I doubt it since I don't even have an iGPU...

Nope, still crashed. Here's the output of `env HOME=/var/lib/ollama HCC_AMDGPU_TARGET=gfx1030 OLLAMA_ORIGINS="*" HSA_OVERRIDE_GFX_VERSION=10.3.0 ROCM_PATH=/opt/rocm OLLAMA_DEBUG=1 ./ollama serve` while attempting to load tinyllama:

```
time=2024-01-26T17:50:25.794-08:00 level=DEBUG source=/home/kainoa/.local/share/ollama-build/server/routes.go:939 msg="Debug logging enabled"
time=2024-01-26T17:50:25.794-08:00 level=INFO...
```

Ignore the deleted comment about AVX2. I still get a crash, built with `CLBlast_DIR=/usr/lib/cmake/CLBlast AMDGPU_TARGETS="gfx1030" ROCM_PATH=/opt/rocm OLLAMA_CUSTOM_CPU_DEFS="-DLLAMA_AVX=on -DLLAMA_AVX2=off" go generate ./... && go build .`:

```
time=2024-01-26T18:12:39.403-08:00 level=DEBUG source=/home/kainoa/.local/share/ollama-build/server/routes.go:939 msg="Debug logging enabled"...
```

Unfortunately, I tried 22 and it was of no help.

Just fixed it!! Here's what I did:

1. Uninstall all `rocm-*` packages
2. Install `opencl-amd-dev`, `amdgpu-pro-oglp`, and `llm-clblast-git`
3. Reboot
4. `cd /opt && sudo ln -s rocm-6.0.0 rocm`
5. ...
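Step 4 looks like the load-bearing one: tooling that expects ROCm at `/opt/rocm` gets pointed at the versioned `/opt/rocm-6.0.0` install via a relative symlink. A minimal sketch of that step, done in a scratch directory instead of `/opt` so it needs no root (directory names mirror the step above; the assumption that the packages install into a versioned `rocm-6.0.0` directory comes from the command itself):

```sh
# Demonstrate the symlink from step 4 in a temp dir instead of /opt (no sudo).
tmp=$(mktemp -d)
mkdir "$tmp/rocm-6.0.0"                 # stands in for the versioned ROCm install
(cd "$tmp" && ln -s rocm-6.0.0 rocm)    # same relative link as `sudo ln -s rocm-6.0.0 rocm`
readlink "$tmp/rocm"                    # prints the link target: rocm-6.0.0
```

Because the link is relative, it stays valid even if the whole parent directory is moved.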

(Probably) related packages I have installed:

```
❯ yay -Q | grep "opencl"
opencl-amd 1:6.0.0-1
opencl-amd-dev 1:6.0.0-2
opencl-clover-mesa 1:23.3.4-3
opencl-headers 2:2023.04.17-2
opencl-rusticl-mesa 1:23.3.4-3
❯ yay -Q | grep "clblast"
clblast-git...
```

> yes it could be nice to type "ollama pull" and have all the models updated.

Yep, that's what I was talking about.

Found exactly what I was looking for. Something like this in ollama's first-party CLI would be great! https://github.com/technovangelist/ollamamodelupdater

Actually, here's a much simpler way to do it in bash:

```sh
for model in $(ollama ls | tail -n +2 | awk -F ':' '{print $1}'); do ollama pull...
```
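The loop above is cut off; a full sketch of the same idea follows. The `ollama ls` sample output here is hypothetical, and `echo` stands in for the real `ollama pull`, so the header-stripping and tag-splitting can be shown without a running Ollama install:

```sh
# Hypothetical sample of `ollama ls` output: a header row plus two models.
sample='NAME              ID      SIZE    MODIFIED
tinyllama:latest  abc123  637 MB  2 days ago
llama2:7b         def456  3.8 GB  5 days ago'

# Drop the header row, keep the model name before the ':' tag, and run one
# pull per model. Replace `echo` with the real command to actually update.
updates=$(printf '%s\n' "$sample" | tail -n +2 | awk -F ':' '{print $1}' |
  while read -r model; do
    echo "ollama pull $model"
  done)
echo "$updates"
```

Splitting on `:` with `awk -F ':'` takes just the model name (`tinyllama`), which `ollama pull` resolves to the default tag.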

+1 on this. Left is VSCode, right is Zed. 3 scroll notches down and up on each, then a short flick of the scroll wheel down and up on each....