Add Local LLM - Enhancement
Would love to be able to use local LLMs like Alpaca and Llama.
@Meathelix1 Thanks for the suggestion. This will be done once we obtain a good open-source model.
Any news on that? Would love to run it on a local LLM.
I also would like to use a free, open-source LLM locally. My suggestion:
Perhaps HuggingChat v0.2 is a good alternative:
https://huggingface.co/chat/
HuggingChat v0.2: "Making the community's best AI chat models available to everyone."
Chat UI (now open-sourced on GitHub): https://github.com/huggingface/chat-ui
Current model: OpenAssistant/oasst-sft-6-llama-30b (https://huggingface.co/OpenAssistant/oasst-sft-6-llama-30b-xor)
Dataset: https://huggingface.co/datasets/OpenAssistant/oasst1
Website: https://open-assistant.io/
Hi @jbdatascience, we would love to add this feature. Apologies for the slow development; we are a small team. If someone would like to open a PR on this, that would be great!
Hey, I'd like to add that most local-LLM adaptations I've seen work by letting the user set the OpenAI API URL: for instance, oobabooga exposes an OpenAI-compatible API for its supported models, and so does AutoGPT.
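To illustrate the idea, here is a minimal sketch of how a request in the OpenAI chat-completions format could be aimed at a local backend instead of api.openai.com. The base URL, port, and model name below are placeholders, not values taken from camel or from any particular local server:

```python
import json

def build_chat_request(base_url, model, messages):
    """Build the URL, headers, and JSON body for an OpenAI-format
    chat completion request pointed at a local server."""
    url = base_url.rstrip("/") + "/v1/chat/completions"
    headers = {
        "Content-Type": "application/json",
        # Local servers typically ignore the key, but OpenAI-style
        # clients expect an Authorization header to be present.
        "Authorization": "Bearer sk-local-placeholder",
    }
    body = json.dumps({"model": model, "messages": messages})
    return url, headers, body

# Example: target an assumed local server instead of OpenAI.
url, headers, body = build_chat_request(
    "http://localhost:5000",  # hypothetical local endpoint
    "llama-30b",              # hypothetical locally served model
    [{"role": "user", "content": "Hello"}],
)
```

Because the request shape is unchanged, a framework that lets users override the API base URL can talk to any backend that speaks this format without further code changes.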
Hi @Terramoto, thanks for the great idea. It seems a lot of people are interested in this feature. Please feel free to open a PR on this!
Support for local models will be added in this PR: https://github.com/camel-ai/camel/pull/245.