llm-client-sdk
SDK for using LLMs
Unlike OpenAI, which simply provides you with an API key, Google's authentication is convoluted, involving service accounts and all sorts of tokens. Personally, I've tried running `gcloud auth print-access-token` and use...
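A minimal sketch of the access-token approach mentioned above: fetch a short-lived OAuth2 token from the gcloud CLI and attach it as a bearer header. `auth_header` is a hypothetical helper, not part of the SDK.

```python
import subprocess


def get_gcloud_access_token() -> str:
    # Shells out to the gcloud CLI; requires `gcloud auth login` to have
    # been run beforehand. The token is short-lived (typically ~1 hour).
    result = subprocess.run(
        ["gcloud", "auth", "print-access-token"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()


def auth_header(token: str) -> dict:
    # Hypothetical helper: Google's REST APIs accept the token as a
    # standard OAuth2 bearer header.
    return {"Authorization": f"Bearer {token}"}
```

For anything long-running you would want a service account and a proper client library instead, since the CLI token expires quickly.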
You can see the openai-cookbook example for using functions with [openai-python](https://github.com/openai/openai-python) [here](https://github.com/openai/openai-cookbook/blob/main/examples/How_to_call_functions_with_chat_models.ipynb). The biggest problem with adding support for it in [OpenAIClient](https://github.com/uripeled2/llm-client-sdk/blob/main/llm_client/llm_api_client/openai_client.py) through [chat_completion](https://github.com/uripeled2/llm-client-sdk/blob/main/llm_client/llm_api_client/openai_client.py#L41) is that OpenAI functions return an object...
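One possible way around the return-type mismatch (a sketch, not the SDK's actual design): when the model answers with a `function_call` object instead of plain text, serialize it to a JSON string so `chat_completion` can keep returning `list[str]`. The message shape below assumes the Chat Completions response format from the cookbook.

```python
import json


def normalize_message(message: dict) -> str:
    # Assumption: `message` is a choice's message dict from the Chat
    # Completions API. A function-call answer has a "function_call" key
    # holding {"name": ..., "arguments": ...} and a null "content".
    if message.get("function_call"):
        # Serialize the call so the caller still receives a str; the
        # caller can json.loads() it to recover the structured call.
        return json.dumps(message["function_call"])
    return message.get("content") or ""
```

The downside is that callers must know to parse the string; an alternative would be widening the return type, which breaks the `BaseLLMClient` contract.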
@aharonYK think you can take it? I think it will be similar to https://github.com/uripeled2/llm-client-sdk/issues/23
We can add an abstract method to BaseLLMClient, e.g. `def list_models(**kwargs) -> list[str]`, and then implement it in the different clients; we can add static constants...
Implement MosaicML BaseLLMAPIClient, find out more about MosaicML API docs [here](https://docs.mosaicml.com/en/latest/inference.html#api-reference). You can find the steps of adding BaseLLMAPIClient in the end of the [README.md](https://github.com/uripeled2/llm-client-sdk#readme)
Implement Cohere BaseLLMAPIClient, find out more about Cohere API docs [here](https://docs.cohere.com/reference/about). You can find the steps of adding BaseLLMAPIClient in the end of the [README.md](https://github.com/uripeled2/llm-client-sdk#readme)
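A skeleton for either of these tasks, following the README's "add a BaseLLMAPIClient" steps, might start like the sketch below (Cohere shown; every name here is an assumption, and the actual request/response wiring would follow the provider's API docs):

```python
from typing import Optional


class CohereClient:
    # Base URL taken from Cohere's public API docs; endpoint paths and
    # payload fields must be checked against the current reference.
    BASE_URL = "https://api.cohere.com/v1/"

    def __init__(self, api_key: str, default_model: str = "command"):
        self._api_key = api_key
        self._default_model = default_model

    def _build_payload(self, prompt: str,
                       model: Optional[str] = None, **kwargs) -> dict:
        # Collect request fields; the real client would POST this with
        # the session object BaseLLMAPIClient provides.
        return {"model": model or self._default_model,
                "prompt": prompt, **kwargs}
```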