
Use locally compiled llama.cpp

Open bmtwl opened this issue 10 months ago • 3 comments

Discussed in https://github.com/oobabooga/text-generation-webui/discussions/5479

Originally posted by bmtwl on February 10, 2024: Hello, I'm working on a private branch of llama.cpp to add some features for an eventual PR, but I'd like to use it in oobabooga before any PR goes in, both as a kind of regression test and because I'd like to use my feature early : ) I didn't find anything about this in previous discussions, the wiki, the README, or anywhere else I've been able to search. Is it possible, and if so, is there a documented procedure? Thanks!

I think there are a variety of reasons someone might want to use a local compile of llama.cpp. Maybe there should be an official guide with steps?

bmtwl avatar Apr 06 '24 14:04 bmtwl

I think https://github.com/abetlen/llama-cpp-python?tab=readme-ov-file#development may help

Touch-Night avatar Apr 06 '24 17:04 Touch-Night
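The linked development instructions boil down to building llama-cpp-python against a local llama.cpp checkout by swapping out its vendored submodule. A minimal sketch of that workflow is below; the repository path and branch name are placeholders for your own local checkout, not values from the thread:

```shell
# Clone llama-cpp-python with its vendored llama.cpp submodule
git clone --recurse-submodules https://github.com/abetlen/llama-cpp-python.git
cd llama-cpp-python

# Point the vendored submodule at your local llama.cpp branch
# (/path/to/your/llama.cpp and your-feature-branch are hypothetical)
cd vendor/llama.cpp
git remote add local /path/to/your/llama.cpp
git fetch local
git checkout local/your-feature-branch
cd ../..

# Build and install in editable mode so the local sources are compiled
pip install -e .
```

After this, text-generation-webui should pick up the locally built llama-cpp-python as long as it is installed into the same Python environment the webui uses.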

With #5627 this would be trivially possible.

StableLlama avatar Apr 15 '24 19:04 StableLlama

Sadly, #5627 was auto-closed, so nobody will see that this issue is still open.

StableLlama avatar Jun 02 '24 16:06 StableLlama