Miguel Ferrada Gutiérrez

Results: 9 comments of Miguel Ferrada Gutiérrez

That is for using OpenAI, isn't it? It works fine, but I need to use Azure. :D How can I do that? Thank you!

This is my problem: when I try it, it doesn't work. It returns: “PS E:\Users\MFG\Desktop\Programming\Eos_2\open-interpreter> interpreter --model azure/gpt-4-turbo > hi We were unable to determine the context window of this model. Defaulting...

Well, I finally solved it. If I call it with: "interpreter -ab https://oaiwestus.openai.azure.com/ -av 2023-07-01-preview", it works. It seems the problem is that it doesn't read these lines in my config.yaml. (Because it...
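For reference, a minimal sketch of that workaround, assuming the -ab and -av short flags can be combined with the --model option from the earlier attempt (the endpoint and API version are the ones quoted above; the Azure API key still has to be configured separately):

    # Hypothetical full invocation; flags and values taken from the comments above,
    # not verified against a specific open-interpreter release.
    interpreter --model azure/gpt-4-turbo -ab https://oaiwestus.openai.azure.com/ -av 2023-07-01-preview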

(Thanks to the team for the development!) :)

I discovered what was wrong in my config.yaml. I had written "llm.api.base" instead of "llm.api_base", and "llm.azure.api_version" instead of "llm.api_version". Now I can run Azure by typing only "interpreter". My config.yaml as an example for...
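A minimal sketch of what the corrected section of config.yaml could look like, assuming the flat dotted-key style implied by the comments here (llm.model is an assumed key name; the endpoint, API version, and model are the ones mentioned in the earlier comments):

    # config.yaml sketch; llm.api_base and llm.api_version are the corrected keys
    # discussed above, llm.model is assumed rather than confirmed.
    llm.model: "azure/gpt-4-turbo"
    llm.api_base: "https://oaiwestus.openai.azure.com/"
    llm.api_version: "2023-07-01-preview"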

try "llm.supports_functions: False" in your config.yaml

Same here. I installed with pip install -e, then I updated llama-index to 0.1-14; same error before and after the update.