llama3
Will llama 3 have function calling support in future?
In #78 it is stated that function calling is not currently supported, so my question is: will it be supported in the future / is it on the Llama 3 roadmap? If yes, is there an approximate date by which we can expect it? If no, then why not? :-( It would help with building our own tools and using them with AutoGen.
Waiting for update.
Yeah, it should, since LLMs without function calling are not enterprise-ready.
Others do see reasonable results with Llama 3 8B function calling: https://www.reddit.com/r/LocalLLaMA/comments/1c7jtwh/function_calling_template_for_llama_3/
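The pattern in templates like that one boils down to: list the tool schemas in the system prompt, ask the model to reply with a single JSON object when it wants to call a tool, and parse that reply. A minimal sketch of the idea, where the tool name, schema layout, and model reply are all made up for illustration (this is not the exact template from the linked thread):

```python
import json

# Hypothetical tool schema -- names and fields are illustrative only.
TOOLS = [
    {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {"city": "string"},
    }
]

def build_system_prompt(tools):
    """Embed the tool schemas in the system prompt and ask the model
    to answer with a single JSON object when it wants to call a tool."""
    return (
        "You have access to these tools:\n"
        + json.dumps(tools, indent=2)
        + '\nTo call a tool, reply with only: {"name": ..., "arguments": {...}}'
    )

def parse_tool_call(reply):
    """Interpret the model's reply as a tool call; return None if it
    is a plain-text answer instead."""
    try:
        call = json.loads(reply)
        return call["name"], call["arguments"]
    except (json.JSONDecodeError, KeyError, TypeError):
        return None

# Simulated model reply -- a real run would send build_system_prompt(TOOLS)
# plus the user message to the model and read back its completion.
reply = '{"name": "get_weather", "arguments": {"city": "Berlin"}}'
print(parse_tool_call(reply))  # ('get_weather', {'city': 'Berlin'})
```

The fragile part in practice is the parse step: smaller models sometimes wrap the JSON in prose, so real templates usually add retry or extraction logic around it.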
This would be highly beneficial. I would love to see a model that is fine-tuned for this, including parallel function calling.
Maybe look at guidance? I have used llama3 with it, and it supports function calling.
https://github.com/guidance-ai/guidance/?tab=readme-ov-file#automatic-call-grammar-for-guidance-functions
With Llama 3 70B I get good structured outputs. I haven't tried function calling yet, but I guess it should work in some scenarios. I will report back once I have tested it.
I did try function calling by making a function-calling ReAct template for Llama 3 70B (see https://www.reddit.com/r/LocalLLaMA/comments/1c7jtwh/comment/l1dksmx/?utm_source=share&utm_medium=web3x&utm_name=web3xcss&utm_term=1&utm_content=share_button)
It seems to be working reasonably well.
I haven't tried it with a large number of functions yet, but from my testing it works well with 2-3 functions and 2-3 parameters per function. I haven't yet found the point where it starts breaking down.
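For anyone unfamiliar with the ReAct pattern behind that template: the model emits an `Action:` line, the harness parses and executes it, and the resulting `Observation:` is appended to the prompt for the next turn. A minimal sketch with a simulated model turn (the tool registry and the exact `Action: tool(args)` syntax are my own assumptions, not the template from the linked comment):

```python
import re

# Toy tool registry -- the tool name is invented for this sketch.
def add(a, b):
    return a + b

TOOLS = {"add": add}

def run_react_step(model_output):
    """Parse a ReAct-style 'Action: tool(arg1, arg2)' line, run the tool,
    and return the Observation string to feed back into the next prompt."""
    m = re.search(r"Action:\s*(\w+)\((.*)\)", model_output)
    if not m:
        return None  # no action found: the model gave a final answer
    name, raw_args = m.group(1), m.group(2)
    args = [int(x) for x in raw_args.split(",")] if raw_args else []
    result = TOOLS[name](*args)
    return f"Observation: {result}"

# Simulated model turn -- a real loop would append the observation to the
# conversation and call the model again until it emits a final answer.
turn = "Thought: I need to add the numbers.\nAction: add(2, 3)"
print(run_react_step(turn))  # Observation: 5
```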
@init27 Should this be closed, the same as https://github.com/meta-llama/llama-recipes/issues/442, now that 3.1 supports <|python_tag|>?
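For context, Llama 3.1 prefixes built-in tool calls (e.g. `brave_search`) with the `<|python_tag|>` special token and ends them with `<|eom_id|>`, so a harness can split tool calls from plain answers by inspecting the completion. A minimal sketch of that split, with the stop-token handling deliberately simplified and the completion simulated:

```python
# Llama 3.1 special tokens for built-in tool calls.
PYTHON_TAG = "<|python_tag|>"
EOM = "<|eom_id|>"

def extract_tool_call(completion):
    """Return the tool-call payload if the completion starts with
    <|python_tag|>, otherwise None (it is a plain-text answer)."""
    if not completion.startswith(PYTHON_TAG):
        return None
    payload = completion[len(PYTHON_TAG):]
    if payload.endswith(EOM):
        payload = payload[: -len(EOM)]
    return payload.strip()

# Simulated completion from a 3.1 model invoking a built-in tool.
completion = '<|python_tag|>brave_search.call(query="llama 3.1")<|eom_id|>'
print(extract_tool_call(completion))  # brave_search.call(query="llama 3.1")
```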