feat: support chatglm
There is a llama.cpp-like project for ChatGLM: https://github.com/li-plus/chatglm.cpp
good catch, def. we can add it!
:warning::warning::warning::warning::warning:
Hi! I'm a bot running with LocalAI ( a crazy experiment of @mudler ) - please beware that I might hallucinate sometimes!
but... I can also be funny or helpful :smile_cat: and I can provide generally good tips or point you to places to look in the documentation or in the code, based on what you wrote in the issue.
Don't engage in conversation with me, I don't support (yet) replying!
:warning::warning::warning::warning::warning:
The LocalAI project has support for embeddings, though they are still listed as planned under the Hot topics / Roadmap list. It also supports audio transcription and text-to-audio, but not GPU/CUDA acceleration. The ChatGPT OSS alternative example exists, and image generation is available according to the News section. The Chatbot-UI and K8sGPT examples are listed in the Examples section, with a separate manual setup example for Chatbot-UI. There is no detailed information on upstreaming the golang bindings or enabling automatic downloading of models from HuggingFace, though they appear as X and [X] respectively under the Hot topics / Roadmap list. There is also an OpenAI functions section, but its status is unclear from the News section.
Sources:
- https://github.com/go-skynet/LocalAI/tree/master/examples/README.md
- https://github.com/go-skynet/LocalAI/tree/master/README.md
- https://localai.io/advanced/index.html
According to the chatglm.cpp project README, I assume it can be made compatible via the C++ backend. https://github.com/li-plus/chatglm.cpp#features
I have tested ChatGLM. It still does not work.
@fishfree which version of ChatGLM did you test, 2 or 3? For 3, they said they have designed a new prompt format.
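For reference, my rough understanding of the ChatGLM3 chat format (based on the upstream ChatGLM3 docs; the turns and system prompt below are only illustrative, so please double-check against the official tokenizer/template):

```
<|system|>
You are ChatGLM3, a helpful assistant.
<|user|>
Hello
<|assistant|>
Hi, how can I help you today?
```

If that is right, a LocalAI model config for ChatGLM3 would need its chat template adjusted to emit these `<|system|>` / `<|user|>` / `<|assistant|>` markers instead of the older round-based ChatGLM/ChatGLM2 format.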
You can access the gRPC interface service of chatglm.cpp using external-grpc-backends. Here is an example: https://github.com/szsteven008/chatglm.cpp/tree/main/examples/server
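For anyone wiring this up, here is a minimal sketch of what I'd expect the configuration to look like, assuming the chatglm.cpp gRPC server from the example above is already running locally. The backend name, address/port, and model file name are all assumptions on my part; see the external gRPC backends section at https://localai.io/advanced/index.html for the exact flag syntax.

```sh
# Register the external chatglm.cpp gRPC server with LocalAI.
# "chatglm:127.0.0.1:50051" is an assumed name:host:port pair.
local-ai --external-grpc-backends "chatglm:127.0.0.1:50051"
```

```yaml
# models/chatglm.yaml (hypothetical model config)
name: chatglm
backend: chatglm            # must match the name given to --external-grpc-backends
parameters:
  model: chatglm3-ggml.bin  # assumed model file loaded by the external backend
```

With that in place, the model should be selectable as `chatglm` through the usual OpenAI-compatible endpoints.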