m-from-space
As a workaround you can exclude your voice folder from being used on-demand by altering the corresponding line in `progressive_download.txt`, which you can find in your game folder.
@arthurwolf You can try building using the following, it worked for me. `CUDACXX=/usr/local/cuda-12/bin/nvcc CMAKE_ARGS="-DLLAMA_CUBLAS=on -DCMAKE_CUDA_ARCHITECTURES=native" FORCE_CMAKE=1 pip install llama-cpp-python --no-cache-dir --force-reinstall --upgrade`
Does ctransformers use the correct Llama instruction template? Does it apply anything at all, or do I have to make sure it's applied myself? I am talking about this:...
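In case the template has to be applied manually, here is a minimal sketch of the Llama-2 chat format (the format itself is the widely documented one; the helper `format_llama2_prompt` is a hypothetical name, and whether your library applies a template for you should be checked against its docs):

```python
def format_llama2_prompt(system: str, user: str) -> str:
    """Wrap a system and user message in the Llama-2 [INST] chat template.

    Hypothetical helper: adjust the template if your model card
    specifies a different format.
    """
    return f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]"

prompt = format_llama2_prompt("You are a helpful assistant.", "Hello!")
print(prompt)
```

Passing this pre-formatted string as the raw prompt sidesteps the question of whether the library templates for you, at the cost of having to track the correct format per model.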
Problem is still present in `llama-cpp-python==0.3.2`
Unfortunately the problem persists with version `llama-cpp-python==0.3.9`. Could this be an issue with `llama.cpp` itself? Shouldn't there be an easy fix, since it only seems to be about the seed not...
Still an issue in version `0.3.12`. Please randomize the initial seed! 🥲
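Until the initial seed is randomized upstream, a minimal workaround sketch is to pass a fresh random seed on every model construction (`seed` is a constructor parameter of `llama_cpp.Llama`; the model path below is a placeholder):

```python
import random

# Draw a fresh seed per run so repeated generations don't all start
# from the same fixed default seed.
seed = random.randint(0, 2**31 - 1)

# Assumed usage, per the llama-cpp-python constructor:
# from llama_cpp import Llama
# llm = Llama(model_path="model.gguf", seed=seed)
```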
My suggestion:
- uninstall `nvidia-cuda-toolkit` on Ubuntu
- install the CUDA 12.3 toolkit from here: https://developer.nvidia.com/cuda-downloads
- build using: `CUDACXX=/usr/local/cuda-12/bin/nvcc CMAKE_ARGS="-DLLAMA_CUBLAS=on -DCMAKE_CUDA_ARCHITECTURES=native" FORCE_CMAKE=1 pip install llama-cpp-python --force-reinstall --upgrade`
> When will we have a recent version of llama-cpp-python, functional with CUDA, via pip? I probably misunderstand your question, but I *am* using 0.3.9 with CUDA via pip. I...
> Man... I've tried for hours to get similar commands to work, never succeeded, I used JamePeng's wheel to make up for it. I just tried your command and it...
I remember having these kinds of issues in my early `llama.cpp` days. The reason back then was my former CPU (not GPU!), which was not able to handle certain instruction...