Scrapegraph-ai
How to set a custom base_url for OpenAI requests?
Hi, I'm using a proxy with a custom URL for OpenAI requests, but the baseURL setting is not working.
This is my code:
graph_config = {
    "llm": {
        "api_key": "YOUR_API_KEY",
        "model": "gpt-3.5-turbo",
        "temperature": 0,
        "configuration": {
            "baseURL": "https://your_custom_url.com",
        },
    },
}
Any solution for this problem?
Try this:
graph_config = {
    "llm": {
        "model": "gpt-3.5-turbo",
        "api_key": "YOUR_API_KEY",
        "temperature": 0,
        "openai_api_base": "https://your_custom_url.com",
    },
}
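Then pass the config to one of the graphs as usual. A minimal usage sketch, assuming the standard SmartScraperGraph entry point from scrapegraphai (the prompt and source URL are placeholders):

from scrapegraphai.graphs import SmartScraperGraph

graph_config = {
    "llm": {
        "model": "gpt-3.5-turbo",
        "api_key": "YOUR_API_KEY",
        "temperature": 0,
        # Assumption from this thread: forwarded to the underlying
        # LangChain ChatOpenAI client as its base URL
        "openai_api_base": "https://your_custom_url.com",
    },
}

# Placeholder prompt and source, for illustration only
smart_scraper_graph = SmartScraperGraph(
    prompt="List all the article titles on the page",
    source="https://example.com",
    config=graph_config,
)

result = smart_scraper_graph.run()
print(result)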
The recent DeepSeek implementation we added in the latest beta works like this: it's still based on the ChatOpenAI class from LangChain, but it uses a custom base URL. The snippet above is adapted from that example. We haven't been able to test it properly, but it should work.
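At the LangChain level, a minimal sketch of what such a setup amounts to, assuming the langchain_openai package is installed (the model name and URL are placeholders):

from langchain_openai import ChatOpenAI

# ChatOpenAI accepts a custom base URL (base_url, also accepted as
# openai_api_base), so requests go to your endpoint instead of
# api.openai.com.
llm = ChatOpenAI(
    model="gpt-3.5-turbo",
    api_key="YOUR_API_KEY",
    temperature=0,
    base_url="https://your_custom_url.com",
)

print(llm.invoke("Hello!").content)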
@f-aguzzi Thanks for the information. How do I install the pre/beta branch to test it locally?
Try the new version; it should include it.
I customized the base_url, but it still gives me an error. Here is the code:
graph_config = {
    "llm": {
        "model": "gpt-3.5-turbo",
        "api_key": "sk-YjB3g***************************************A3B0",
        "temperature": 0,
        "openai_api_base": "https://free.gpt.ge",
    },
}
error: AuthenticationError: Error code: 401 - {'error': {'message': 'Incorrect API key provided: sk-YjB3g***************************************A3B0. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}
Yes, but your API key is not valid; please create a new one.
Yes, same issue here. Even though I set openai_api_base to another domain for authentication, it still authenticates against OpenAI. That's wrong! I don't know why that happens. I think a feature for custom models on top of the OpenAI base is needed in a future release.
@thangnqs Yes, the OpenAI class first verifies that the API key is valid against the OpenAI endpoints and only then takes the model from your URL; the problem is if your endpoint requires authentication too. You can look into creating a standard LLM interface compatible with LangChain (Reddit - CustomLLM); a sketch is below. If you manage to write a generic CustomLLM class, please send a PR under the models module. Thanks!
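For anyone attempting that, here is a minimal sketch of LangChain's custom LLM interface, assuming an OpenAI-compatible proxy endpoint; ProxyLLM and the /v1/chat/completions path are illustrative, not part of scrapegraph-ai:

from typing import Any, List, Optional

import requests
from langchain_core.language_models.llms import LLM


class ProxyLLM(LLM):
    """Hypothetical LLM that calls a proxied OpenAI-compatible endpoint
    directly, so the API key is never validated against api.openai.com."""

    base_url: str
    api_key: str
    model: str = "gpt-3.5-turbo"

    @property
    def _llm_type(self) -> str:
        return "proxy-openai"

    def _call(
        self,
        prompt: str,
        stop: Optional[List[str]] = None,
        run_manager: Optional[Any] = None,
        **kwargs: Any,
    ) -> str:
        # Assumption: the proxy mirrors the OpenAI chat completions schema.
        response = requests.post(
            f"{self.base_url}/v1/chat/completions",
            headers={"Authorization": f"Bearer {self.api_key}"},
            json={
                "model": self.model,
                "messages": [{"role": "user", "content": prompt}],
                "stop": stop,
            },
            timeout=60,
        )
        response.raise_for_status()
        return response.json()["choices"][0]["message"]["content"]

Such a class could then sit in the models module and be selected through the same graph_config dictionary.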
Please, is this issue solved now? How do I call the brokering API?