Request for pre-built llama-cpp-python 0.2.84 wheels for CUDA 12.1
Hi,
On a recent project I was using this framework and ran into issues while installing and building it locally. Fortunately, I discovered the pre-built wheels provided by the repo, which worked really well for me. However, I now need a newer version of llama-cpp-python (0.2.84) to support Llama 3.1, and pre-built wheels for it are not yet available.
Could you please help me out with this?
There are pre-built binaries on the Releases page: https://github.com/abetlen/llama-cpp-python/releases/tag/v0.2.84-cu121
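If you need the wheels before the package index is updated, one workaround is to install an asset from that release directly with pip. This is only a sketch: the filename below is a placeholder, so pick the asset from the release page that matches your Python version, platform, and CUDA 12.1.

```bash
# Workaround sketch: install a wheel straight from the v0.2.84-cu121 release assets.
# <matching-wheel>.whl is a placeholder; substitute the asset that matches
# your Python version and platform from the release page.
pip install "https://github.com/abetlen/llama-cpp-python/releases/download/v0.2.84-cu121/<matching-wheel>.whl"
```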
Related: https://github.com/abetlen/llama-cpp-python/issues/1627
@lsorber It's not as easy to install automatically from there, since the Releases page isn't a Python package index as the README suggests.
@abetlen The latest releases haven't been pushed to GitHub Pages.
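For context, the install route the README describes (as I understand it) points pip at the GitHub Pages wheel index, which is why the missing Pages update breaks automatic installs. A minimal sketch, assuming the cu121 index path below is the one from the README:

```bash
# README-style install route (index URL assumed from the docs):
# pip resolves llama-cpp-python from the GitHub Pages wheel index, so this
# fails for 0.2.84 until the new wheels are pushed there.
pip install llama-cpp-python==0.2.84 \
  --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cu121
```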
@gformcreation Don't close this, it's not fixed.
Should be fixed now