Add alternative interface for self-hosted LLM
Instead of ChatGPT, it would be nice to have it interface with a self-hosted LLM such as llama2 or similar, to avoid having to pay for API usage.
Agreed! I'm working on a llama2 version now. It would be fantastic if there were an official solution!
I'm also experimenting with llama2. It doesn't seem too hard: you can use an OpenAI-compatible API and just change the operation mode in the script. It does take someone with decent Python skills, though.
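For reference, the OpenAI-compatible route usually comes down to pointing the script at a local server that accepts the same `/v1/chat/completions` payload (llama.cpp's server, Ollama, and vLLM all expose this style of endpoint). A minimal sketch of the request body — the URL and model name here are assumptions, adjust for your setup:

```python
import json

# Hypothetical local endpoint; swap in wherever your server listens.
url = "http://localhost:8000/v1/chat/completions"

# The payload shape mirrors OpenAI's chat completions API, which is why
# an OpenAI-compatible local server can be dropped in with minimal changes.
payload = {
    "model": "llama-2-7b-chat",  # whatever name your server registers
    "messages": [{"role": "user", "content": "Hello"}],
    "temperature": 0.7,
}

body = json.dumps(payload)
print(body)
```

From there it's a plain HTTP POST to `url` with `body`; many local servers also ignore the `Authorization` header, so no real API key is needed.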
You might want to have a look at related issues such as #1.
FYI: I gave llama-2-7b a quick try but it crashed, mostly because the model didn't generate a response in the format the framework accepts. Maybe llama-2-13b or 70b would work, but this is a signal that the framework may be tightly coupled to the OpenAI models (text-davinci-002 and text-davinci-003).
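Since smaller models often drift from the exact format a framework expects, one way to avoid a hard crash is a defensive parse step that rejects malformed replies instead of passing them downstream. A minimal sketch — the required keys here are a hypothetical example schema, not the framework's actual one:

```python
import json


def parse_reply(text, required_keys=("action", "target")):
    """Return the parsed reply if it is valid JSON containing the
    required keys, else None so the caller can retry or fall back.
    The key names are a hypothetical example, not the real schema."""
    try:
        data = json.loads(text)
    except json.JSONDecodeError:
        return None
    if not all(k in data for k in required_keys):
        return None
    return data


# A well-formed reply parses; free-form chatter is rejected, not crashed on.
ok = parse_reply('{"action": "move", "target": "cafe"}')
bad = parse_reply("Sure! Here is my plan: ...")
print(ok, bad)
```

A caller could loop on `None` with a re-prompt asking the model to answer in JSON only, which tends to help the smaller llama-2 variants.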
I'm trying to make it work with most LLMs, and since they all happen to have a web interface, I've developed a Chrome plugin to embed the AI town: https://github.com/10cl/chatdev