
[FEAT]: OPENAI BASE params

Open sherry0429 opened this issue 5 months ago • 9 comments

What would you like to see?

The openai npm package supports a configurable API base, similar to a proxy setting.

For anyone who only wants to use the OpenAI API (rather than another LLM service) but through a custom endpoint, this config would help a lot.

sherry0429 avatar Feb 01 '24 03:02 sherry0429

Ah, so like being able to hook up any "generic" service that is fully compatible with the OpenAI API schema just by overriding the baseURL?

timothycarambat avatar Feb 01 '24 17:02 timothycarambat

Yes, this technique is widely used by OpenAI LLM integrators. They set up an intermediary site and generate an intermediary API key for users. Users use that (host, api_key) pair to reach the integrator's servers, which then forward the requests to OpenAI. For example, in the request:

curl https://api.openai.com/v1/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{
    "model": "gpt-3.5-turbo-instruct",
    "prompt": "Say this is a test",
    "max_tokens": 7,
    "temperature": 0
  }'

Here `https://{OPENAI_API_BASE}/v1/completions` and `$OPENAI_API_KEY` would be replaced with the values from the LLM integrator, which implements all of OpenAI's public interfaces.
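The substitution above can be sketched as a tiny helper (`buildEndpoint` is a hypothetical name, not part of any SDK; the URLs are just examples):

```javascript
// Sketch of the base-URL substitution described above.
// buildEndpoint is a hypothetical helper, not part of any SDK.
function buildEndpoint(baseURL, path) {
  // Trim trailing slashes so "https://test.api/v1/" and
  // "https://test.api/v1" produce the same endpoint.
  return baseURL.replace(/\/+$/, "") + path;
}

// Default OpenAI host:
console.log(buildEndpoint("https://api.openai.com/v1", "/completions"));
// → https://api.openai.com/v1/completions

// Same request routed through an integrator's host:
console.log(buildEndpoint("https://test.api/v1", "/completions"));
// → https://test.api/v1/completions
```

Only the host changes; the path scheme, request body, and auth header stay exactly as OpenAI defines them, which is why any schema-compatible integrator can sit behind the override.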

In AnythingLLM, when users use OpenAI models, they could override the request's host address, enabling them to use any proxy service that integrates with OpenAI and implements OpenAI's public API. This would benefit those facing poor network conditions who want to use custom services or plugins.

sherry0429 avatar Feb 02 '24 02:02 sherry0429

I am not sure how to set this in the official openai npm package, but in Python:

import openai

openai.api_base = ""  # custom API base, e.g. https://test.api/v1
openai.api_key = ""   # custom key
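For the Node SDK, the equivalent knob depends on the major version. This is only a sketch of the option names (the URL is a placeholder), not a tested integration:

```javascript
// openai@3.x exposed the override via Configuration({ basePath }):
const v3Options = {
  apiKey: process.env.OPENAI_API_KEY, // custom key
  basePath: "https://test.api/v1",    // custom API base
};

// openai@4.x takes baseURL directly in the client constructor,
// e.g. new OpenAI({ apiKey, baseURL: "https://test.api/v1" }):
const v4Options = {
  apiKey: process.env.OPENAI_API_KEY,
  baseURL: "https://test.api/v1",
};
```

The rename from `basePath` to `baseURL` between v3 and v4 matters here, since passing the wrong one is silently ignored.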

sherry0429 avatar Feb 02 '24 02:02 sherry0429

Furthermore, it would be beneficial if AnythingLLM could publish a list of the OpenAI APIs it uses. That could further ignite enthusiasm among similar middleware and integrators to integrate with AnythingLLM.

sherry0429 avatar Feb 02 '24 02:02 sherry0429

@timothycarambat could AnythingLLM add this feature in the future? Thanks

sherry0429 avatar Feb 04 '24 09:02 sherry0429

The following code, with the `basePath` parameter added, should be able to specify a proxy, but during testing I hit errors and couldn't figure out where the problem was.

const config = new Configuration({
  apiKey: process.env.OPEN_AI_KEY,
  basePath: "http://xxx.xxx.xxx.xxx:3000/v1",
});

2024-02-07T08:37:30.839326169Z OpenAI:listModels Request failed with status code 401
2024-02-07T08:37:42.421404929Z [TELEMETRY SENT] {
2024-02-07T08:37:42.421448968Z   event: 'api_key_created',
2024-02-07T08:37:42.421457794Z   distinctId: 'b988b14e-f0d5-4fe1-a3f4-48ec0fd63266',
2024-02-07T08:37:42.421463778Z   properties: { runtime: 'docker' }
2024-02-07T08:37:42.421469076Z }
2024-02-07T08:46:33.544755978Z Error: OpenAI::CreateModeration failed with: Request failed with status code 503
2024-02-07T08:46:33.544802705Z     at /app/server/utils/AiProviders/openAi/index.js:106:15
2024-02-07T08:46:33.544810634Z     at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
2024-02-07T08:46:33.544815584Z     at async OpenAiLLM.isSafe (/app/server/utils/AiProviders/openAi/index.js:95:50)
2024-02-07T08:46:33.544820165Z     at async streamChatWithWorkspace (/app/server/utils/chats/stream.js:35:34)
2024-02-07T08:46:33.544824563Z     at async /app/server/endpoints/chat.js:94:9

Jiongguang avatar Feb 07 '24 09:02 Jiongguang

The error here is because chats flow through a moderation endpoint that normally exists for OpenAI, but since the provider URL and API key changed, it is going to 503 since it's not a valid key for that endpoint.

We will make a "generic" provider that is basically OpenAI, but where you can modify the baseURL and API key. We don't want to edit the OpenAI method, mostly because many people would not need this, but it would be nice to have a generic wrapper that functions in the way you have outlined.
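A minimal sketch of what such a generic provider's configuration could look like. The env var names here are hypothetical, not AnythingLLM's actual settings, and the moderation skip reflects the 503 discussed above:

```javascript
// Hypothetical config for a "generic OpenAI" provider. None of these
// env var names are AnythingLLM's real settings; this only illustrates
// the shape of the feature.
function genericOpenAiConfig(env) {
  return {
    apiKey: env.GENERIC_OPEN_AI_API_KEY ?? "",
    // Fall back to the real OpenAI host when no override is given.
    basePath: env.GENERIC_OPEN_AI_BASE_PATH ?? "https://api.openai.com/v1",
    // Compatible proxies often don't implement /moderations, so a
    // generic provider would likely skip that call entirely.
    skipModeration: true,
  };
}

console.log(genericOpenAiConfig({ GENERIC_OPEN_AI_BASE_PATH: "http://localhost:3000/v1" }));
```

Keeping this as a separate provider (rather than patching the OpenAI one) means users on the official endpoint are untouched, while proxy users opt in explicitly.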

timothycarambat avatar Feb 07 '24 14:02 timothycarambat

Thanks for your reply

Jiongguang avatar Feb 18 '24 11:02 Jiongguang

👋 everyone, my team is working on the use case @sherry0429 mentioned above. I think a generic OpenAI LLM wrapper is a wonderful feature that will enhance the appeal of AnythingLLM.

kloudtaxi avatar Feb 26 '24 14:02 kloudtaxi

Please, I need this. big-AGI currently has it, and I want to switch to AnythingLLM, but this option is missing. An extra input that sets the OpenAI base URL would be great.

krishna-praveen avatar Apr 21 '24 13:04 krishna-praveen

I too am looking for a way to change the Base URL. Any guidance would be appreciated.

dantosXD avatar Apr 22 '24 18:04 dantosXD