
Is there any GUI or Web UI for ggllm.cpp?

Open JohnClaw opened this issue 2 years ago • 13 comments

JohnClaw avatar Jul 09 '23 01:07 JohnClaw

or an HTTP API?

chrisbward avatar Jul 09 '23 04:07 chrisbward

I want that awesome simple web server from llama.cpp as well.

[Screenshot (2023-07-09): the llama.cpp web server UI]

mirek190 avatar Jul 09 '23 16:07 mirek190

We'll definitely need something like that. Though I have a ton of features and ideas to try and only 15 hours a day.

It will have to wait for a bit, or someone else ports that.

cmp-nct avatar Jul 10 '23 00:07 cmp-nct

Thanks @cmp-nct

I have a simple UI based on the official Falcon space, using ggllm via ctransformers: https://huggingface.co/spaces/matthoffner/falcon-mini

I've been building Spaces on HF using ggml FastAPI servers; I have a boilerplate repo I'm working from here: https://github.com/matthoffner/ggml-fastapi
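For illustration, here is a minimal sketch (untested) of what a ctransformers-backed FastAPI endpoint for a GGML Falcon model can look like; the model path and the /generate route are placeholders, not what the linked Space or boilerplate actually use:

```python
from ctransformers import AutoModelForCausalLM
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

# Load a GGML Falcon model; the path below is a placeholder.
llm = AutoModelForCausalLM.from_pretrained(
    "path/to/falcon-7b-instruct.ggmlv3.q4_0.bin",
    model_type="falcon",
)

class GenerateRequest(BaseModel):
    prompt: str
    max_new_tokens: int = 256

@app.post("/generate")
def generate(req: GenerateRequest):
    # Simple synchronous completion; a real server would stream tokens.
    text = llm(req.prompt, max_new_tokens=req.max_new_tokens)
    return {"text": text}
```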

matthoffner avatar Jul 11 '23 20:07 matthoffner

A modified falcon_server.cpp is attached. It can help you build a web UI through an HTTP API. server_code.zip
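For reference, a hypothetical client call against such a server, assuming it exposes a llama.cpp-style POST /completion endpoint on localhost:8080; the route and payload fields are guesses rather than taken from server_code.zip:

```python
import requests

# Assumed endpoint and fields, modeled on llama.cpp's example server.
resp = requests.post(
    "http://localhost:8080/completion",
    json={"prompt": "Write a haiku about falcons.", "n_predict": 64},
)
resp.raise_for_status()
print(resp.json())
```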

hiwudery avatar Jul 12 '23 15:07 hiwudery

I have a simple UI based on the official Falcon space, using ggllm via ctransformers: https://huggingface.co/spaces/matthoffner/falcon-mini

FYI, it just errors out at the moment.

linuxmagic-mp avatar Jul 12 '23 15:07 linuxmagic-mp

I have a simple UI based on the official Falcon space, using ggllm via ctransformers: https://huggingface.co/spaces/matthoffner/falcon-mini

FYI, it just errors out at the moment.

Feel free to open an issue; it might scale down when it's not being used.

matthoffner avatar Jul 12 '23 15:07 matthoffner

I meant a Windows GUI app, or a local offline web UI that can be opened in Microsoft Edge, etc.

JohnClaw avatar Jul 12 '23 18:07 JohnClaw

From my current roadmap view, I'll look into performance optimizations as the next step, and once that is done I'll look into the best way to quickly add accessibility through a web frontend.

cmp-nct avatar Jul 15 '23 02:07 cmp-nct

I started working on a fork of llama-cpp-python for ggllm.cpp, but it's not working yet. Anyone who wants to help is more than welcome: falcon-cpp-python
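For context, this is the kind of usage the fork would presumably aim for, mirroring the upstream llama-cpp-python API; the model path is a placeholder and falcon-cpp-python itself does not work like this yet:

```python
from llama_cpp import Llama  # the upstream API the fork is based on

llm = Llama(model_path="path/to/model.bin")  # placeholder path
out = llm("Q: What is ggllm.cpp? A:", max_tokens=64, stop=["Q:"])
print(out["choices"][0]["text"])
```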

sirajperson avatar Jul 18 '23 07:07 sirajperson

I started working on a fork of llama-cpp-python for ggllm.cpp, but it's not working yet. Anyone who wants to help is more than welcome: falcon-cpp-python

I think that rather than MORE forks and confusion, you might just pull from llama-cpp-python and help make that work for both llama and falcon models. They are already working on that. Just a suggestion. I will be testing some of the prerequisites later this week.

linuxmagic-mp avatar Jul 18 '23 23:07 linuxmagic-mp

I should offer a word of caution: we'll see huge changes with the next release, more than all previous updates combined. If time permits, it will already include a minimal web-based GUI that can then be further developed and extended. I expect it to be finished within a week, though my time planning is usually off.

cmp-nct avatar Jul 19 '23 02:07 cmp-nct

It seems LocalAI already has support for ggllm. (I have not tried this out yet.) As it offers an OpenAI-compatible API, you can use it in conjunction with any web-based client such as chatbot-ui.
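As an illustration, you can point the openai Python client at LocalAI's OpenAI-compatible endpoint; the host, port, and model name below are placeholders and have to match whatever you actually configured in LocalAI:

```python
import openai

openai.api_base = "http://localhost:8080/v1"  # LocalAI's default port
openai.api_key = "not-needed"                 # placeholder; LocalAI typically does not require a real key

resp = openai.ChatCompletion.create(
    model="falcon-7b",  # placeholder; use your configured model name
    messages=[{"role": "user", "content": "Hello from chatbot-ui!"}],
)
print(resp["choices"][0]["message"]["content"])
```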

djmaze avatar Jul 29 '23 13:07 djmaze