how to use open-source models?
Please add examples using local open-source models, like llama or chatGLM. Thanks
thx for the advice. Just trying to understand your needs better: do you need local open-source models because of concerns about GPT cost, for privacy, or for other reasons?
I think we should add this; it's an easy win and opens doors for many efforts. @yuyuan223 @basicmi
More than cost or privacy: I think we can play with different models in the same game and compare them.
I saw in the video the ability to select open source models, but when I ran skyagi I found that it was not possible to choose. Is it not supported yet?
We are working on local/cloud-hosted open-source LLM support, with the help of https://github.com/tensorchord/modelz-llm.
It should be available next week.
I successfully ran SkyAGI with lmsys/fastchat-t5-3b-v1.0 on my PC, without a GPU:
# Terminal 1: serve the model through an OpenAI-compatible API on CPU
modelz-llm -m lmsys/fastchat-t5-3b-v1.0 --device cpu
# Terminal 2: point skyagi at the local server and run it
export OPENAI_API_BASE=http://localhost:8000
export OPENAI_API_KEY="anystring"
skyagi
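
In case it helps, since modelz-llm exposes an OpenAI-compatible API, you can sanity-check the local endpoint from Python before launching skyagi. This is only a sketch: it assumes the pre-1.0 openai package is installed, and the model name passed here is illustrative (the local server may ignore it or expect a different form).

# sanity_check.py -- minimal sketch, assuming openai<1.0 is installed
import openai

openai.api_base = "http://localhost:8000"  # the modelz-llm server from Terminal 1
openai.api_key = "anystring"               # placeholder; the local server does not validate it

resp = openai.ChatCompletion.create(
    model="fastchat-t5-3b-v1.0",  # assumed name; modelz-llm may ignore this field
    messages=[{"role": "user", "content": "Say hello in one short sentence."}],
)
print(resp["choices"][0]["message"]["content"])

If this request succeeds, skyagi should be able to reach the same endpoint through OPENAI_API_BASE.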
I have some workarounds:
- Change the embedding size from 1536 to 384, because I am using sentence-transformers/all-MiniLM-L6-v2, which generates 384-dimension vectors (#92); see the sketch below.
- Upgrade to the latest langchain (could use openai-gpt-3.5-text-davinci-003 to avoid #93).
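
For anyone reproducing the first workaround, here is a minimal sketch of the embedding swap, assuming skyagi's memory goes through langchain embeddings and that sentence-transformers is installed; the variable names are only illustrative.

from langchain.embeddings import HuggingFaceEmbeddings

# Use a local sentence-transformers model instead of OpenAI embeddings.
# all-MiniLM-L6-v2 returns 384-dimensional vectors, so any place that
# hard-codes OpenAI's 1536-dimensional size (e.g. the vector index) has to use 384.
embeddings = HuggingFaceEmbeddings(
    model_name="sentence-transformers/all-MiniLM-L6-v2"
)
vector = embeddings.embed_query("hello world")
print(len(vector))  # 384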
I will write a doc for it.
Thank you @gaocegege for writing a doc, because so far I can't make it work:
pip install modelz-llm transformers sentencepiece accelerate
modelz-llm -m lmsys/fastchat-t5-3b-v1.0 --device cpu
works, but skyagi doesn't add fastchat to its list of models.