
Will Llama 3 have function calling support in the future?

Open Greatz08 opened this issue 10 months ago • 8 comments

In #78 it is stated that function calling is not currently supported, so my question is: will it be supported in the future? Is it on the Llama 3 roadmap? If yes, is there an approximate date by which we can expect it? If no, why not? :-( It would help with building our own tools and using them with AutoGen.

Greatz08 avatar Apr 20 '24 10:04 Greatz08

Waiting for an update.

Mandeep0001 avatar Apr 20 '24 17:04 Mandeep0001

Yeah, it should, since LLMs without function calling are not enterprise-ready.

kushagradeep avatar Apr 20 '24 20:04 kushagradeep

Others do see reasonable results with Llama 3 8B function calling: https://www.reddit.com/r/LocalLLaMA/comments/1c7jtwh/function_calling_template_for_llama_3/

zoltan-fedor avatar Apr 21 '24 03:04 zoltan-fedor
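The template approach linked above can be sketched roughly as follows. This is a minimal illustration, not the exact template from the Reddit post: the prompt wording, the `get_weather` function, and the expected reply format are all assumptions for demonstration.

```python
import json

def build_system_prompt(tools):
    """Render a tool list into a system prompt. The wording here is
    illustrative; real templates vary and need tuning per model."""
    spec = json.dumps(tools, indent=2)
    return (
        "You have access to the following functions. To call one, reply "
        'with a single JSON object of the form {"name": <function name>, '
        '"arguments": {...}} and nothing else.\n\n' + spec
    )

def parse_tool_call(reply):
    """Interpret a model reply as a tool call; return None for plain text."""
    try:
        call = json.loads(reply)
    except json.JSONDecodeError:
        return None
    if isinstance(call, dict) and "name" in call and "arguments" in call:
        return call["name"], call["arguments"]
    return None

# Hypothetical tool definition and model reply:
tools = [{
    "name": "get_weather",
    "description": "Get the current weather for a city",
    "parameters": {"city": {"type": "string"}},
}]
prompt = build_system_prompt(tools)
reply = '{"name": "get_weather", "arguments": {"city": "Paris"}}'
print(parse_tool_call(reply))
```

The key point is that nothing model-side changes: the "support" lives entirely in the prompt and in how the caller parses the reply, which is why people get reasonable results even without an officially fine-tuned format.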

This would be highly beneficial. I would love to see a model that is fine-tuned for this, including parallel function calling.

aronbrand avatar Apr 21 '24 13:04 aronbrand

Maybe look at guidance? I have used llama3 with it, and it supports function calling.

https://github.com/guidance-ai/guidance/?tab=readme-ov-file#automatic-call-grammar-for-guidance-functions

xrd avatar Apr 21 '24 15:04 xrd

With Llama 3 70B I get good structured outputs. I haven't tried function calling yet, but I guess it should work in some scenarios. I will let you know once I have tested it.

teis-e avatar Apr 30 '24 22:04 teis-e
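Structured output is only useful if the caller actually checks it, so a minimal validation step like the following sketch helps; the field names and schema shape here are made up for illustration:

```python
import json

def validate_structured_output(reply, required):
    """Check a model reply against a minimal schema: `required` maps
    field name -> expected Python type. Returns the parsed dict or
    raises ValueError so the caller can retry/re-prompt."""
    try:
        data = json.loads(reply)
    except json.JSONDecodeError as exc:
        raise ValueError(f"not valid JSON: {exc}") from exc
    for field, typ in required.items():
        if field not in data:
            raise ValueError(f"missing field: {field}")
        if not isinstance(data[field], typ):
            raise ValueError(f"field {field!r} should be {typ.__name__}")
    return data

# Hypothetical model reply:
reply = '{"title": "Llama 3 report", "score": 7}'
data = validate_structured_output(reply, {"title": str, "score": int})
```

Wrapping generation in a retry loop around this validator is a common way to paper over the occasional malformed reply from models without native structured-output support.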

I did try function calling by making a function-calling ReAct template for Llama 3 70B (see https://www.reddit.com/r/LocalLLaMA/comments/1c7jtwh/comment/l1dksmx/?utm_source=share&utm_medium=web3x&utm_name=web3xcss&utm_term=1&utm_content=share_button)

It seems to be working reasonably well.

I haven't tried it with a large number of functions yet, but I have tested it with 2-3 functions and 2-3 parameters per function, and it seems to work well. I haven't yet reached the limit where it starts breaking down.

zoltan-fedor avatar Apr 30 '24 23:04 zoltan-fedor
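A ReAct-style template like the one described above typically has the model emit `Thought:` / `Action:` / `Action Input:` lines, which the caller then parses. A rough sketch of that parsing side, with an invented line format and function name (the actual template in the linked comment may differ):

```python
import json
import re

# One illustrative ReAct reply format:
#   Thought: ...
#   Action: <function name>
#   Action Input: <JSON arguments>
ACTION_RE = re.compile(r"^Action:\s*(\w+)", re.MULTILINE)
INPUT_RE = re.compile(r"^Action Input:\s*(\{.*\})", re.MULTILINE)

def parse_react_step(text):
    """Extract (function, arguments) from a ReAct-style reply, or None
    if the model produced a final answer instead of an action."""
    action = ACTION_RE.search(text)
    args = INPUT_RE.search(text)
    if action and args:
        return action.group(1), json.loads(args.group(1))
    return None

# Hypothetical model reply:
reply = (
    "Thought: I need the current weather before answering.\n"
    "Action: get_weather\n"
    'Action Input: {"city": "Berlin"}\n'
)
print(parse_react_step(reply))
```

In a full agent loop, the caller would execute the parsed function, append an `Observation:` line with the result, and re-prompt until the model emits a final answer instead of an action.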

@init27 Should this be closed, the same as https://github.com/meta-llama/llama-recipes/issues/442, now that 3.1 supports `<|python_tag|>`?

codefromthecrypt avatar Aug 26 '24 05:08 codefromthecrypt
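For context on the `<|python_tag|>` format mentioned above: Llama 3.1 prefixes tool-call completions with the `<|python_tag|>` special token (terminated by `<|eom_id|>`), so the caller can distinguish a tool call from plain assistant text. A minimal sketch of that dispatch, where the `brave_search.call(...)` payload is just an example of a built-in tool invocation:

```python
PYTHON_TAG = "<|python_tag|>"
EOM = "<|eom_id|>"

def split_tool_call(completion):
    """If the completion starts with <|python_tag|>, return the tool-call
    body; otherwise treat it as plain assistant text.
    Returns a (kind, payload) tuple."""
    if completion.startswith(PYTHON_TAG):
        body = completion[len(PYTHON_TAG):]
        if body.endswith(EOM):
            body = body[: -len(EOM)]
        return "tool_call", body.strip()
    return "text", completion.strip()

# Example completion in the Llama 3.1 built-in-tool style:
print(split_tool_call(
    '<|python_tag|>brave_search.call(query="llama 3.1 release")<|eom_id|>'
))
```

This is the sense in which 3.1 answers the original question: the tool-call format is baked into the model's training rather than bolted on through a custom prompt template.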