How do I connect a custom API-based large model?
Is your feature request related to a problem? Please describe. I want to connect a local large model through its API, but I didn't find any sample code in the official documentation.
Describe the solution you'd like Please provide a clear guide on how to connect to a local large model through its API.
Describe alternatives you've considered Is `guidance.models.Model` the parent class for custom API-based models? If so, which methods of that class need to be implemented to wrap a custom API model?
Additional context
@lizijian0630 I agree with you. There are big problems with the docs.
If you look here, you can see that guidance can use LiteLLM, which itself can work with Ollama or various popular remote LLM services.
So in theory you can wire it up like this:
Your remote LLM Service -> LiteLLM -> guidance
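As a rough sketch of that chain, you could run a LiteLLM proxy in front of a local Ollama server and point an OpenAI-compatible client at the proxy. The model name and default ports below are examples; check the LiteLLM and Ollama docs for your versions.

```shell
# Hedged sketch, not verified against a specific guidance version.
# Start the local LLM server (Ollama listens on port 11434 by default).
ollama serve &

# Run the LiteLLM proxy in front of it (listens on port 4000 by default).
# Any OpenAI-compatible client can then target http://localhost:4000.
litellm --model ollama/llama3
```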
RE: LiteLLM
OK, so how do we set api_base now?
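Whatever the right way to pass `api_base` through guidance turns out to be, at the LiteLLM level it is just the URL prefix glued onto the standard OpenAI-style `/chat/completions` route. The helper name `build_request` below is illustrative (not part of guidance or LiteLLM); the `/v1` path is Ollama's OpenAI-compatible endpoint.

```python
# Sketch of what setting api_base amounts to for an OpenAI-compatible
# backend: the client POSTs a chat payload to <api_base>/chat/completions.
# build_request is a hypothetical helper, not a real library function.
import json

def build_request(api_base: str, model: str, prompt: str) -> tuple[str, bytes]:
    """Return the (url, body) an OpenAI-compatible client would POST."""
    url = api_base.rstrip("/") + "/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return url, body

# Ollama exposes an OpenAI-compatible API at http://localhost:11434/v1
url, body = build_request("http://localhost:11434/v1", "llama3", "Hello")
```

This is why `api_base` matters: without a way to override it, the client defaults to a hosted provider's URL instead of your local server.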
EDIT: I looked at #648 and the rest of the code base, and the fact that the litellm tests seem to have been stubbed out makes me think that ollama/litellm support is not a priority here, and that development is going down the happy path where mainstream hosted APIs are the only real test targets... Otherwise a fix based on #648 would have been merged by now.
Originally posted by @rcarmo in https://github.com/guidance-ai/guidance/issues/687#issuecomment-2132172618
See also:
- https://github.com/guidance-ai/guidance/issues/648