
How to configure the openai API proxy endpoint?

Open xjspace opened this issue 1 year ago • 4 comments

Is your feature request related to a problem? Please describe.

Hi, how do I set an API proxy endpoint instead of the official OpenAI API address? Could I place it into .env? What's the environment variable name?

Describe the solution you'd like

How to configure the openai API proxy endpoint?

Describe alternatives you've considered

No response

Additional context

No response

xjspace avatar Dec 06 '23 16:12 xjspace

You can use the flag: interpreter --api_base https://host.com/v1

Or you can edit the config: run interpreter --config and add api_base: "https://host.com/v1"
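For reference, the config-file approach looks like the fragment below. The host URL is a placeholder; the config file path and format can vary between Open Interpreter versions, so treat this as a sketch rather than the exact file contents:

```yaml
# Fragment of the config file opened by `interpreter --config`.
# Point requests at an OpenAI-compatible proxy instead of api.openai.com.
api_base: "https://host.com/v1"
```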

@xjspace let me know if this solves your question.

Notnaton avatar Dec 06 '23 16:12 Notnaton

Notnaton, strangely, when I did this OpenAI still appears in the model name when running code. I have been trying to run Code Llama through Hugging Face; this is the line I'm referring to: Model: openai/huggingface/codellama/CodeLlama-34b-Instruct-hf

Interpreter Info

    Vision: False
    Model: openai/huggingface/codellama/CodeLlama-34b-Instruct-hf
    Function calling: None
    Context window: 3000
    Max tokens: 400

    Auto run: False
    API base: https://api-inference.huggingface.co/models/codellama/CodeLlama-34b-Instruct-hf
    Offline: False

RisingVoicesBk avatar Jan 24 '24 19:01 RisingVoicesBk

This is because we add it, so that litellm uses the OpenAI format to communicate with the endpoint. There is a change coming up to stop doing this, possibly in the next update.
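The behavior above can be sketched roughly as follows. This is an illustrative simplification, not Open Interpreter's or litellm's actual code: litellm chooses the wire format from the provider prefix on the model string, so prepending "openai/" forces OpenAI-style requests against the custom api_base, and the prefix then shows up in the Interpreter Info model name:

```python
def split_provider(model: str) -> tuple[str, str]:
    """Split a litellm-style model string into (provider, remainder)."""
    provider, _, rest = model.partition("/")
    return provider, rest

# The "openai/" prefix selects the OpenAI request format; the remainder
# (the Hugging Face model path) is passed through unchanged.
provider, rest = split_provider(
    "openai/huggingface/codellama/CodeLlama-34b-Instruct-hf"
)
```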

Notnaton avatar Jan 25 '24 13:01 Notnaton

https://github.com/KillianLucas/open-interpreter/pull/955

Notnaton avatar Feb 12 '24 10:02 Notnaton