text-generation-webui
Use locally compiled llama.cpp
Discussed in https://github.com/oobabooga/text-generation-webui/discussions/5479
Originally posted by bmtwl on February 10, 2024:

Hello, I'm working on a private branch of llama.cpp to add some features for an eventual PR, but I'd like to use it in oobabooga before any PR goes in, both as a kind of regression test and because I'd like to use my feature early :) I didn't find anything about this in previous discussions, the wiki, the README, or anywhere else I've been able to search. Is this possible, and if so, is there a documented procedure? Thanks!
I think there are a variety of reasons someone might want to use a local compile of llama.cpp. Maybe there should be an official guide with steps?
I think the development instructions at https://github.com/abetlen/llama-cpp-python?tab=readme-ov-file#development may help.
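To expand on that: text-generation-webui loads GGUF models through the llama-cpp-python bindings rather than calling llama.cpp directly, so one route is to build llama-cpp-python against your own llama.cpp checkout and install it into the webui's environment. Here is a minimal sketch based on those development docs; the fork URL and branch name are placeholders for yours, and the exact `CMAKE_ARGS` flags depend on your hardware and on the llama-cpp-python version, so verify them against the linked README before relying on this:

```bash
# Clone the bindings together with the vendored llama.cpp submodule
git clone --recurse-submodules https://github.com/abetlen/llama-cpp-python.git
cd llama-cpp-python

# Point the vendored submodule at your private branch
# (<you> and my-feature-branch are placeholders for your fork and branch)
cd vendor/llama.cpp
git remote add mine https://github.com/<you>/llama.cpp.git
git fetch mine
git checkout mine/my-feature-branch
cd ../..

# Build and install into the same environment text-generation-webui uses,
# replacing the prebuilt wheel pinned in its requirements.txt.
# Set CMAKE_ARGS for your backend; flag names vary between releases
# (e.g. -DLLAMA_CUBLAS=on on older versions, -DGGML_CUDA=on on newer ones).
pip install --upgrade pip
CMAKE_ARGS="-DGGML_CUDA=on" pip install -e . --force-reinstall --no-cache-dir
```

Launching the webui in that same environment should then pick up your build. Two caveats: the C++ sources are compiled at install time, so re-run the `pip install` step after changing them, and depending on your install the webui may import a backend-specific package (such as `llama_cpp_cuda`) rather than plain `llama_cpp`, in which case that package is the one you need to replace.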
With #5627 merged, this would be trivially possible.
Sadly, #5627 was auto-closed, so now nobody will see that this issue is still open.