
<|python_tag|> for JSON output

mmoskal opened this issue 10 months ago

Docs in this repo say that when using custom tool calling with JSON, one should use `Environment: ipython` and expect output like `<|python_tag|>{ "type": "function", "name": ...`

Docs on the website say you should not use `Environment: ipython` and should expect plain text without `"type": "function"` (i.e., just `{"name": "...", "parameters": ...}`).

An unconstrained Llama 3.1 70B with `Environment: ipython` outputs according to the repo docs. However, 3.3 70B outputs something like `<|python_tag|>get_weather(location=...)`.

I'm trying to implement tool calling behind an OpenAI-compatible interface and need to know how to prompt the model and what to expect in its answer. Note that I have a fancy constraint engine, so I can actually force the model to output JSON (with the right schema) after `<|python_tag|>`, but I would rather avoid going off-distribution.
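For context, here is a best-effort sketch of how a server might accept all three output variants described above while the format question is open. The function name `parse_tool_call` and the normalized return shape are my own (hypothetical), not from any Llama or OpenAI library; the three variants it handles are exactly the ones reported in this issue.

```python
import ast
import json
from typing import Optional

PYTHON_TAG = "<|python_tag|>"

def parse_tool_call(text: str) -> Optional[dict]:
    """Best-effort parse of the tool-call formats observed from Llama 3.1/3.3.

    Returns {"name": ..., "parameters": {...}} on success, or None if the
    text does not look like a tool call. (Sketch only.)
    """
    body = text.strip()
    if body.startswith(PYTHON_TAG):
        body = body[len(PYTHON_TAG):].strip()

    # Variants 1 and 2: JSON, either the repo-docs form
    # {"type": "function", "name": ..., "parameters": ...} or the
    # website-docs form {"name": ..., "parameters": ...}.
    try:
        obj = json.loads(body)
        if isinstance(obj, dict) and "name" in obj:
            return {"name": obj["name"], "parameters": obj.get("parameters", {})}
    except json.JSONDecodeError:
        pass

    # Variant 3: pythonic call syntax such as get_weather(location="Paris"),
    # as reported for 3.3 70B above. Only literal keyword arguments are kept.
    try:
        node = ast.parse(body, mode="eval").body
        if isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
            params = {
                kw.arg: ast.literal_eval(kw.value)
                for kw in node.keywords
                if kw.arg is not None
            }
            return {"name": node.func.id, "parameters": params}
    except (SyntaxError, ValueError):
        pass

    return None
```

This sidesteps the prompt-format question rather than answering it, but it lets one endpoint tolerate whichever variant the hosted model actually emits.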

This is relevant for llama models hosted in Azure.

cc @datdo-msft @tot0

mmoskal — Jan 16 '25