
Support accessing GPT through proxy interfaces

Open chenwenhang opened this issue 2 years ago

Is your feature request related to a problem? Please describe. For certain reasons we cannot use the OpenAI API directly and have to access the model through a proxy API. Guidance doesn't seem to support this at the moment.

Describe the solution you'd like Allow users to set a custom API endpoint or proxy for accessing the LLM, instead of only specifying the model by name.

This project is wonderful! Will there be plans to support this feature in the future?

chenwenhang avatar May 21 '23 10:05 chenwenhang

Maybe you can try the endpoint parameter. I'm not sure it actually works; I only noticed it while reading the source code.

import guidance

# "xxxx" is a placeholder for the base URL of your proxy
guidance.llm = guidance.llms.OpenAI("text-davinci-003", endpoint="xxxx")

SimFG avatar May 24 '23 07:05 SimFG

You should be able to change api_base in the openai SDK to route requests for the model through your own endpoint:

import guidance
import openai

# Point the openai SDK at your proxy; guidance uses the SDK under the hood,
# so calls made for the model below will go through this base URL
openai.api_base = 'YOUR_ENDPOINT_HERE'

guidance.llm = guidance.llms.OpenAI("gpt-4")

alexlau811 avatar Jun 04 '23 07:06 alexlau811

If you set rest_call=True and endpoint when initializing guidance.llms.OpenAI, it will make a REST call to the endpoint (which can be a proxy, as long as it accepts the same arguments as the OpenAI API).
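
For example, a minimal sketch based on the description above (the endpoint URL is a placeholder for your proxy, which must accept OpenAI-style requests):

import guidance

# rest_call=True makes guidance send plain REST requests to the given endpoint
# instead of going through the openai SDK
guidance.llm = guidance.llms.OpenAI(
    "text-davinci-003",
    endpoint="https://your-proxy.example.com/v1/completions",  # hypothetical proxy URL
    rest_call=True,
)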

Please reopen if this doesn't solve the issue.

marcotcr avatar Jun 06 '23 19:06 marcotcr

I want to know how to get the endpoint.

shengyin1224 avatar Jul 24 '24 11:07 shengyin1224