[Feature] Add `llama.cpp` local backend model support
Description
Using OpenAI models requires access to the international Internet and an OpenAI API key, which may be difficult for some users. Introducing a local model backend such as llama.cpp, which integrates a large number of open-source models and supports a range of hardware (Apple M-series chips, CUDA, CPU, etc.), can be a good solution.
See the llama.cpp repository.
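One possible integration path (a sketch for discussion, not something decided in this issue) is llama.cpp's bundled `llama-server`, which exposes an OpenAI-compatible `/v1/chat/completions` endpoint, so an existing OpenAI-style backend could be pointed at a local model mostly by swapping the base URL. A minimal sketch, assuming a `llama-server` instance running locally on its default port 8080 (the function and constant names here are illustrative, not part of any existing API):

```python
import json
import urllib.request

# Assumed address of a locally running `llama-server` instance,
# which serves an OpenAI-compatible API under /v1.
LOCAL_BASE_URL = "http://localhost:8080/v1"


def build_chat_request(messages, model="local-model", temperature=0.7):
    """Build an OpenAI-style chat completion payload.

    llama-server loads a single GGUF file at startup, so the `model`
    field is essentially informational here.
    """
    return {
        "model": model,
        "messages": messages,
        "temperature": temperature,
    }


def chat(messages, base_url=LOCAL_BASE_URL):
    """Send a chat completion request to the local llama.cpp server."""
    payload = json.dumps(build_chat_request(messages)).encode("utf-8")
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # Response shape mirrors the OpenAI chat completions format.
    return body["choices"][0]["message"]["content"]
```

With a server started as `llama-server -m model.gguf`, calling `chat([{"role": "user", "content": "Hello!"}])` would then return the local model's reply without any external API key.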
@ocss884 will be working on this.