local-llm-function-calling
can this be used for function calling with other opensource models?
Hi, I just found this repo and find it really interesting. I'm wondering whether it can be used for function calling with other models like Llama 2, Falcon, Vicuna, PaLM, or others, where the model is served via an inference endpoint just like OpenAI's. 🤔🤔 @rizerphe
I'm currently working on fine-tuning my own models (codellama-7b with a simple QLoRA fine-tune performs surprisingly well) and on writing an OpenAI-compatible server that supports function calling; I'll give you an update when it's usable (hopefully by the end of the week). Right now, as far as I'm aware, the only project that does this is functionary, and it has its own caveats (not that mine won't have similar ones).
That's awesome, I'm waiting excitedly for it. I need something that can handle tons of functions and call them accordingly.
btw look here: https://huggingface.co/PathOr/PathOr_LLama_70B_CHAT
Any updates here?
Thanks, everyone. This project should work with most open-source models that support function calls. I just implemented a prompter for my codellama fine-tune; to integrate a model of your choice, you only have to implement a prompter for it. I am, however, just getting started on writing prompters for other models, and I don't have access to PathOr specifically yet. What models would you like to see integrated?
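For anyone wondering what "implement a prompter" means in practice: a prompter is just a piece of code that renders the available function schemas and the user's query into the text format a given model expects. Here's a minimal, hedged sketch of the idea; the `SimplePrompter` class and its prompt layout are hypothetical illustrations, not the library's actual API, so check the project source for the real interface to implement:

```python
import json


class SimplePrompter:
    """Hypothetical prompter: renders JSON-schema function definitions
    and a user query into a plain-text prompt for an instruction-tuned
    model. The exact layout here is made up for illustration."""

    def prompt(self, functions: list[dict], query: str) -> str:
        # List every available function as a JSON schema, then the query,
        # then a cue telling the model to emit a function call.
        schemas = "\n".join(json.dumps(f) for f in functions)
        return (
            "You can call these functions:\n"
            f"{schemas}\n"
            f"User: {query}\n"
            "Function call (JSON):"
        )


# Usage: a single hypothetical weather function, rendered for the model.
functions = [
    {
        "name": "get_weather",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
        },
    }
]
text = SimplePrompter().prompt(functions, "What's the weather in Brooklyn?")
print(text)
```

The point is that each model family (Llama 2, Falcon, Vicuna, etc.) has its own chat/instruction template, so supporting a new model mostly comes down to writing one of these formatting classes for it.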