How to connect an LLM that is not on the supported list
We are using a private LLM that is not on Unstract's supported list. It is similar to OpenAI or Gemini, but connecting to it requires additional parameters such as a JWT token, a subscription key, and an endpoint address.
We would like to test connecting this private LLM to the Unstract open-source version. Could you advise if this is possible?
Hello @ferdyhong. Thank you for reaching out to us. Apologies for the delayed reply. I would like to understand the problem better. When you say you have a private LLM, do you mean that you have an in-house built LLM, or is it something that is available publicly like OpenAI or Gemini?
Hi, @gaya3-zipstack It's a bit complex, but it refers to OpenAI set up in a private cloud environment. My company has purchased OpenAI and configured it in a private cloud environment. So, it has a request structure similar to OpenAI, but with additional elements like JWT tokens incorporated.
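For context, a request to such an OpenAI-compatible private deployment typically differs only in its base URL and headers. Here is a minimal sketch of that idea; the endpoint, header names, and values below are hypothetical examples based on the description above, not the actual configuration of any specific deployment:

```python
# Sketch: the extra request parameters a private OpenAI-compatible
# deployment might need. All names below (header names, endpoint path)
# are illustrative assumptions, not real values.

def build_chat_request(endpoint: str, jwt_token: str,
                       subscription_key: str, model: str,
                       messages: list) -> tuple:
    """Return (url, headers, json_body) for an OpenAI-style chat call."""
    url = endpoint.rstrip("/") + "/v1/chat/completions"
    headers = {
        # JWT bearer token in place of a plain OpenAI API key
        "Authorization": f"Bearer {jwt_token}",
        # Gateway subscription key (example header name)
        "Ocp-Apim-Subscription-Key": subscription_key,
        "Content-Type": "application/json",
    }
    body = {"model": model, "messages": messages}
    return url, headers, body


url, headers, body = build_chat_request(
    "https://llm.internal.example.com", "example-jwt", "sub-123",
    "gpt-4", [{"role": "user", "content": "hello"}],
)
```

The request body stays standard OpenAI chat-completions JSON; only the transport-level details change.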
Oh, I see. So does this mean that only the connection/configuration parameters are different? Internally, Unstract uses llama-index APIs for talking to the LLM at various points. I would like to understand whether you have been using the OpenAI client library directly, or other libraries like llama-index, in your development so far.
Please check if this is useful and can be applied to your use-case https://docs.unstract.com/unstract/contributing/unstract/sdk/contribute-adapter/
Please check out the section - "Things to keep in mind"
Hi, @gaya3-zipstack I just realized you replied, sorry about that. Do you mean I should modify the Python file using the LlamaIndex approach to fit my environment?
You could add another adapter similar in structure to the existing ones we have for other LLMs. You could add any special config parameters you need to connect to this specific private LLM that you are talking about.
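As a rough illustration of what such an adapter could look like, here is a sketch. The class name, method names, and config keys are hypothetical, not the actual Unstract SDK interface; follow the adapter contribution guide linked above for the real structure:

```python
# Hypothetical sketch of a custom LLM adapter for a private
# OpenAI-compatible deployment. Base structure, method names, and
# config keys are illustrative only.

class PrivateOpenAIAdapter:
    def __init__(self, settings: dict):
        # Extra parameters this private deployment needs, beyond the
        # usual model name: endpoint, JWT token, subscription key.
        self.endpoint = settings["endpoint"]
        self.jwt_token = settings["jwt_token"]
        self.subscription_key = settings["subscription_key"]
        self.model = settings.get("model", "gpt-4")

    @staticmethod
    def get_name() -> str:
        return "private-openai"

    def get_llm_kwargs(self) -> dict:
        # Keyword arguments that could be handed to an OpenAI-compatible
        # llama-index client (e.g. an api_base plus default headers
        # carrying the JWT and subscription key).
        return {
            "model": self.model,
            "api_base": self.endpoint,
            "default_headers": {
                "Authorization": f"Bearer {self.jwt_token}",
                "Ocp-Apim-Subscription-Key": self.subscription_key,
            },
        }


adapter = PrivateOpenAIAdapter({
    "endpoint": "https://llm.internal.example.com",
    "jwt_token": "example-jwt",
    "subscription_key": "sub-123",
})
```

The key point is that the adapter is just the place where these deployment-specific connection parameters get collected and translated into whatever the underlying llama-index client expects.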