sigma-957

4 comments from sigma-957

Can confirm the results [here](https://github.com/ollama/ollama/issues/2473#issuecomment-1943449490) with my 6650XT. Edit: using ROCm 6.0.0.

This is what worked for me:

`export AMDGPU_TARGET=gfx1030 HSA_OVERRIDE_GFX_VERSION=10.3.0 ROCM_PATH=/opt/rocm CLBlast_DIR=/usr/lib/cmake/CLBlast`

PKGBUILD:

```
pkgname=ollama-rocm
pkgdesc='Create, run and share large language models (LLMs) with ROCm'
pkgver=0.1.24
pkgrel=1
arch=(x86_64)
url='https://github.com/jmorganca/ollama'
license=(MIT)
_ollamacommit=69f392c9b7ea7c5cc3d46c29774e37fdef51abd8
...
```
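For reference, a minimal sketch of how those variables get used when starting the server, assuming an RDNA2 card overridden to gfx1030 and a stock `/opt/rocm` install (the `ollama serve` step is just my usual way of running it, not part of the PKGBUILD above):

```
# Sketch: export the overrides, then start ollama against the ROCm libraries.
# HSA_OVERRIDE_GFX_VERSION tells ROCm to treat the 6650XT (gfx1032) as gfx1030.
export AMDGPU_TARGET=gfx1030
export HSA_OVERRIDE_GFX_VERSION=10.3.0
export ROCM_PATH=/opt/rocm
export CLBlast_DIR=/usr/lib/cmake/CLBlast
ollama serve
```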

Ah yes, I forgot I also had to add that env var.

I used the server provided with oobabooga and I get the same thing. I set the same API key in textwebui and in Emacs to replicate it.
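A rough way to check the key outside Emacs, assuming oobabooga's OpenAI-compatible API is listening on its default port (5000) and using a placeholder key `sk-test` (both are assumptions, adjust to your setup):

```
# Hypothetical smoke test: send a request with the same API key the client uses.
# The port and the key "sk-test" are placeholders for this sketch.
curl http://127.0.0.1:5000/v1/chat/completions \
  -H "Authorization: Bearer sk-test" \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "hello"}]}'
```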