Sanyam Bhutani
Ok, in that case I will close the PR. The expected flow would be: you should install the `llama-recipes` package/repo and then run the examples. Please let me know if you...
@himanshushukla12 Yeah, a simple app would be a cool idea, thanks for suggesting! Quick note: 11B and 90B both support a single image at a time (you cannot switch to a different image later...
Thank you, I appreciate your interest! Yeah, a Gradio app with all the HF params for inference would be a great contribution!
Feel free to ask for help!
@himanshushukla12 Sorry for missing this; looks great! Can you please make sure the CI/CD is green and merge?
1. Sometimes copying from the browser can make it inconsistent; can you please double-check? 2. Can you please also try renewing the URL, since it expires every 24h? Please also...
@BaiqingL Closing the issue for now, but please let us know if you would still be interested in sending a PR! Would love to help and collaborate. Thanks!
@pbontrager thanks, and thanks @musabgultekin for the quick PR! Actually, I need some clarification from the team. Context: when we do a tool call, a single flow is: > (step-1) user (with...
@musabgultekin Thanks for the reply. Actually, I'm referring to the `Tokens` section [here](https://github.com/meta-llama/llama-models/blob/main/models/llama3_3/prompt_format.md#tokens) > : End of message. A message represents a possible stopping point for execution where the model can inform...
@ebsmothers Actually, @RdoubleA and I agree it should be: `eot = False if (role == 'tool' or (role == 'assistant' and next_message_role == 'tool')) else True` It seems we need...
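To make the proposed condition concrete, here is a minimal, hedged sketch of how it could be applied when walking a tool-call conversation. The helper name, the message dicts, and the loop are hypothetical illustrations, not the library's actual API; only the boolean condition itself comes from the comment above.

```python
def should_set_eot(role, next_message_role):
    """Return True when a turn should end with <|eot_id|> rather than <|eom_id|>.

    Per the proposed condition: tool messages, and assistant messages that are
    immediately followed by a tool message, get eot=False (i.e. <|eom_id|>,
    since execution continues); everything else gets eot=True.
    """
    return not (
        role == "tool"
        or (role == "assistant" and next_message_role == "tool")
    )


# Hypothetical walk over a tool-call flow; the last message has no successor,
# so next_message_role is None there.
messages = [
    {"role": "user"},       # eot=True
    {"role": "assistant"},  # followed by a tool result -> eot=False (<|eom_id|>)
    {"role": "tool"},       # tool result -> eot=False (<|eom_id|>)
    {"role": "assistant"},  # final answer -> eot=True (<|eot_id|>)
]
for i, msg in enumerate(messages):
    nxt = messages[i + 1]["role"] if i + 1 < len(messages) else None
    msg["eot"] = should_set_eot(msg["role"], nxt)
```

This keeps the decision in one place, so if the team settles on different semantics for the assistant-before-tool case, only the helper changes.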