
How do I set a custom base_url for OpenAI requests?

thangnqs opened this issue 9 months ago

Hi, I'm using a proxy with a custom URL on top of the OpenAI base, but the baseURL setting isn't working. This is my code:

graph_config = {
    "llm": {
        "api_key": "YOUR_API_KEY",
        "model": "gpt-3.5-turbo",
        "temperature": 0,
        "configuration": {
          "baseURL": "https://your_custom_url.com",
        },
    },
}

Any solution for this problem?

thangnqs avatar May 13 '24 04:05 thangnqs

Try this:

graph_config = {
    "llm": {
        "model": "gpt-3.5-turbo",
        "api_key": "YOUR_API_KEY",
        "temperature": 0,
        "openai_api_base": "https://your_custom_url.com",
    },
}

The recent DeepSeek implementation we added in the latest beta works in a similar way: it's still based on the ChatOpenAI class from LangChain, but it uses a custom base URL. The snippet above is adapted from that example. We haven't been able to test it thoroughly, but it should work.
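For context, here is a minimal sketch of roughly what that configuration maps to underneath, assuming LangChain's ChatOpenAI class is used directly (the exact wiring inside Scrapegraph-ai may differ):

from langchain_openai import ChatOpenAI

# Hedged sketch: ChatOpenAI accepts a custom OpenAI-compatible endpoint via
# openai_api_base, which is roughly what the "openai_api_base" key above sets.
llm = ChatOpenAI(
    model="gpt-3.5-turbo",
    api_key="YOUR_API_KEY",
    temperature=0,
    openai_api_base="https://your_custom_url.com",
)
print(llm.invoke("Hello!").content)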

f-aguzzi avatar May 13 '24 09:05 f-aguzzi

@f-aguzzi Thanks for the information. How do I install the pre/beta branch to test it locally?

thangnqs avatar May 15 '24 03:05 thangnqs

Try the new version; it should include it.
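For anyone following along, a couple of common ways to pull in a newer or pre-release build (the branch and repository layout below are my assumptions, not confirmed in this thread):

pip install --upgrade scrapegraphai        # latest stable release from PyPI
pip install --pre --upgrade scrapegraphai  # also allow pre-release (beta) versions
# Assuming the pre/beta branch lives on the main GitHub repository:
pip install git+https://github.com/VinciGit00/Scrapegraph-ai.git@pre/beta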

VinciGit00 avatar May 15 '24 06:05 VinciGit00

I customized the base_url, but it still gives me an error. code:

graph_config = {
    "llm": {
        "model": "gpt-3.5-turbo",
        "api_key": "sk-YjB3g***************************************A3B0",
        "temperature": 0,
        "openai_api_base": "https://free.gpt.ge",
    },
}

error: AuthenticationError: Error code: 401 - {'error': {'message': 'Incorrect API key provided: sk-YjB3g***************************************A3B0. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}

leiyu2023 avatar May 15 '24 07:05 leiyu2023

Yes, but your API key is not valid; please create a new one.

VinciGit00 avatar May 15 '24 08:05 VinciGit00

Yes, I'm hitting the same issue. Even though I set openai_api_base to a different domain for authentication, it still authenticates against OpenAI's endpoint. That's wrong!

I don't know why that happens. I think a feature for custom models built on the OpenAI base is needed in a future release.

thangnqs avatar May 15 '24 09:05 thangnqs

@thangnqs Yes, the OpenAI class first verifies that the API key is valid against the OpenAI endpoints and only then fetches the model from your URL; the problem is when your endpoint requires authentication too. You can look into creating a standard LLM interface compatible with LangChain (see the Reddit thread on CustomLLM). If you manage to write a generic CustomLLM class (a rough sketch follows below), please send a PR under the models module. Thanks!
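A minimal sketch of what such a class could look like, assuming LangChain's documented custom LLM interface and an OpenAI-compatible /v1/chat/completions route on the proxy (the class name and endpoint path are illustrative, not part of Scrapegraph-ai):

from typing import Any, List, Optional

import requests
from langchain_core.language_models.llms import LLM


class CustomEndpointLLM(LLM):
    """Calls an OpenAI-compatible endpoint without authenticating against api.openai.com."""

    base_url: str
    api_key: str
    model: str = "gpt-3.5-turbo"

    @property
    def _llm_type(self) -> str:
        return "custom-endpoint"

    def _call(self, prompt: str, stop: Optional[List[str]] = None, **kwargs: Any) -> str:
        # Assumption: the proxy exposes an OpenAI-style chat completions route.
        resp = requests.post(
            f"{self.base_url}/v1/chat/completions",
            headers={"Authorization": f"Bearer {self.api_key}"},
            json={"model": self.model, "messages": [{"role": "user", "content": prompt}]},
            timeout=60,
        )
        resp.raise_for_status()
        return resp.json()["choices"][0]["message"]["content"]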

PeriniM avatar May 15 '24 10:05 PeriniM

Please, is this issue solved now? How do I call a relay (proxy) API?

lzl-hello avatar Sep 24 '24 01:09 lzl-hello