AgentVerse
If I want to set up a local ChatGLM2-6B model, what changes should I make?
@shanggangli Currently, we only support OpenAI's models. If you want to use your local models, you need to create a Python script here: https://github.com/OpenBMB/AgentVerse/tree/main/agentverse/llms and define your API calls there.
I'm interested in this, can you assign it to me?
@JetSquirrel Certainly! We initially intended to use FastChat for integrating local models, but did not have time to implement it. FastChat offers an interface compatible with OpenAI's API for local models such as LLaMA and Vicuna, which might allow us to reuse our existing class for OpenAI models without needing new classes for local models. Could you first investigate FastChat to see whether it is compatible with ChatGLM and other Hugging Face models?
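For reference, here is a rough sketch of what "OpenAI-compatible" means in practice: FastChat's API server speaks the same `/chat/completions` request shape as OpenAI, so a local model could in principle be hit with the same payload our OpenAI class already builds. The base URL (`http://localhost:8000/v1`, FastChat's default) and the model name `chatglm2-6b` are assumptions for illustration, not tested against AgentVerse:

```python
def build_chat_request(model: str, user_message: str,
                       base_url: str = "http://localhost:8000/v1"):
    """Build the endpoint URL and JSON payload for an OpenAI-style
    chat completion request against a local FastChat server.

    Assumed defaults: FastChat's openai_api_server listens on
    localhost:8000 and serves the model under the given name.
    """
    url = f"{base_url}/chat/completions"
    payload = {
        "model": model,  # e.g. "chatglm2-6b" as registered with FastChat
        "messages": [{"role": "user", "content": user_message}],
        "temperature": 0.7,
    }
    return url, payload

# To actually send it (requires a running FastChat server):
#   import requests
#   url, payload = build_chat_request("chatglm2-6b", "Hello!")
#   resp = requests.post(url, json=payload, timeout=60)
#   print(resp.json()["choices"][0]["message"]["content"])
```

If this shape holds for ChatGLM, the existing OpenAI class might only need a configurable base URL rather than a new model class.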