llama-recipes
Llama 3.1 Code Interpreter file reference
I have searched "Model Cards and Prompt Formats" and the GitHub repositories for the correct way to reference a file in the prompt so that the code interpreter can use it, but I have not found an example. The paper only shows a few examples where it is done like this: file_path = "path/to/file". I'm not sure this is the right way, because it sometimes doesn't work, and it gets worse when the language is not English.
In the "llama-agentic-system" repository there are some examples of working with files, but the code has too many layers and I couldn't trace my way down to the actual prompt that is sent to the model.
I double-checked that the code interpreter tool is enabled by the line "Environment: ipython" in the system prompt. I also tried enabling the brave_search and wolfram_alpha tools, but the result is the same. Here is the prompt I'm trying:
<|begin_of_text|><|start_header_id|>system<|end_header_id|>
Environment: ipython
Tools: brave_search, wolfram_alpha
Cutting Knowledge Date: December 2023
Today Date: 25 July 2024
You are a helpful assistant.<|eot_id|><|start_header_id|>user<|end_header_id|>
Give me a summary
file_path = "./document.docx"<|eot_id|><|start_header_id|>assistant<|end_header_id|>
When I play with the temperature parameter, the model sometimes responds with the <|python_tag|> token and sometimes it does not. I am using Llama 3.1 70B (fp8) by NeuralMagic.
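For clarity, what I am hoping for when the code interpreter does fire is a continuation along these lines (illustrative only, not a verbatim transcript), after which the executed output would be returned to the model in an ipython turn as described in the prompt-format docs:

<|python_tag|>from docx import Document
doc = Document("./document.docx")
print("\n".join(p.text for p in doc.paragraphs))<|eom_id|><|start_header_id|>ipython<|end_header_id|>
...executed output of the code above...<|eot_id|><|start_header_id|>assistant<|end_header_id|>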