
Another Client compatible with OpenAI

feuyeux opened this issue 1 year ago • 1 comment

Search before asking

  • [X] I have searched the issues and found no similar feature request.

Description

There is currently no AiClient that is compatible with the OpenAI API, can access other Large Language Model (LLM) open APIs, and can also reach locally hosted models.

What I want in the Java version already exists in the Python version, where llm_config can work with other LLM open APIs.

Use case

var joe = AssistantAgent.builder()
                .client(OpenAiClient.builder()
                        .openaiApiBase(URL)
                        .openaiApiKey(API_KEY)
                        .build().init())
                .name("joe")
                .systemMessage("Your name is Joe and you are a part of a duo of comedians.")
                .humanInputMode(NEVER)
                .build();

https://github.com/feuyeux/hello-autogen/blob/main/hello-autogen-java/src/test/java/org/feuyeux/ai/autogen/HelloAutogenTests.java

local_llm_config = {
    "config_list": [
        {
            "model": "NotRequired",  # Loaded with LiteLLM command
            "api_key": "NotRequired",  # Not needed
            "base_url": "http://0.0.0.0:4000"  # Your LiteLLM URL
        }
    ],
    "cache_seed": None  # Turns off caching, useful for testing different models
}

joe = ConversableAgent(
    "joe",
    system_message="Your name is Joe and you are a part of a duo of comedians.",
    llm_config=local_llm_config,
    human_input_mode="NEVER",  # Never ask for human input.
)

https://github.com/feuyeux/hello-autogen/blob/main/hello-autogen-python/hello_autogen.py
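Because LiteLLM exposes the OpenAI wire format, a client targeting either backend sends the same chat-completions request; only the base URL and key change. A rough Java sketch of assembling such a request (the URL and key are placeholders mirroring the Python config above, not autogen4j code):

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class LocalChatRequestSketch {

    // Build an OpenAI-style chat-completions request against any
    // compatible backend; only baseUrl and apiKey vary per backend.
    static HttpRequest build(String baseUrl, String apiKey) {
        String body = """
                {"model": "NotRequired",
                 "messages": [{"role": "user", "content": "Tell me a joke."}]}
                """;
        return HttpRequest.newBuilder()
                .uri(URI.create(baseUrl + "/chat/completions"))
                .header("Content-Type", "application/json")
                .header("Authorization", "Bearer " + apiKey)
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
    }

    public static void main(String[] args) {
        // Placeholder local LiteLLM URL; local servers typically ignore the key.
        HttpRequest request = build("http://0.0.0.0:4000", "NotRequired");
        System.out.println(request.uri());
        // -> http://0.0.0.0:4000/chat/completions
    }
}
```

The request is only built here, not sent, so the sketch runs without a server; swapping the base URL for https://api.openai.com/v1 and a real key would target OpenAI with the identical body.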

feuyeux avatar May 29 '24 12:05 feuyeux

When you use autogen4j from dependencies, you can see in openai-client:0.2.2 that OpenAiClient.class requires OPENAI_PROXY, OPENAI_API_KEY, or OPENAI_ORGANIZATION to be set before it will work:

this.openaiProxy = this.getOrEnvOrDefault(this.openaiProxy, "OPENAI_PROXY");

OkHttpClient.Builder httpClientBuilder = new OkHttpClient.Builder()
        .connectTimeout(this.requestTimeout, TimeUnit.SECONDS)
        .readTimeout(this.requestTimeout, TimeUnit.SECONDS)
        .writeTimeout(this.requestTimeout, TimeUnit.SECONDS)
        .callTimeout(this.requestTimeout, TimeUnit.SECONDS);

httpClientBuilder.addInterceptor((chain) -> {
    this.openaiApiKey = this.getOrEnvOrDefault(this.openaiApiKey, "OPENAI_API_KEY");
    this.openaiOrganization = this.getOrEnvOrDefault(this.openaiOrganization, "OPENAI_ORGANIZATION", "");

    Request.Builder requestBuilder = chain.request().newBuilder();
    requestBuilder.header("Content-Type", "application/json");
    if (this.isAzureApiType()) {
        requestBuilder.header("api-key", this.openaiApiKey);
    } else {
        requestBuilder.header("Authorization", "Bearer " + this.openaiApiKey);
        requestBuilder.header("OpenAI-Organization", this.openaiOrganization);
    }

To make this work with a local LLM and model served from a URL, another generic client needs to be set up, similar to the one the Python version offers. As a feature, it would also be nice to have a flag-controlled switch between the OpenAI client and a local-LLM client. I can help add this feature so that both kinds of models are supported and the library becomes more extensible.
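The proposed switch could look something like the sketch below. All names here (LlmClient, OpenAiCompatibleClient, the useLocal flag) are hypothetical illustrations for this feature request, not existing autogen4j APIs:

```java
// Hypothetical sketch of a flag-controlled switch between an OpenAI-backed
// client and a local-LLM client. None of these names exist in autogen4j.
public class SwitchableClientSketch {

    // Common interface so agents don't need to know which backend is in use.
    interface LlmClient {
        String endpoint();
    }

    // One implementation covers both backends, since local servers such as
    // LiteLLM speak the same wire format; only the base URL and key differ.
    record OpenAiCompatibleClient(String baseUrl, String apiKey) implements LlmClient {
        @Override
        public String endpoint() {
            // A real client would POST chat requests to this URL.
            return baseUrl + "/chat/completions";
        }
    }

    // The proposed switch: pick the backend from a flag instead of
    // hard-coding OpenAI credentials into the client.
    static LlmClient create(boolean useLocal) {
        return useLocal
                ? new OpenAiCompatibleClient("http://0.0.0.0:4000", "NotRequired")
                : new OpenAiCompatibleClient("https://api.openai.com/v1",
                        System.getenv("OPENAI_API_KEY"));
    }

    public static void main(String[] args) {
        System.out.println(create(true).endpoint());
        // -> http://0.0.0.0:4000/chat/completions
    }
}
```

With this shape, AssistantAgent.builder().client(...) could accept any LlmClient, and the flag (or a config file entry) would decide between OpenAI and a local model at startup.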

katemamba avatar Jun 08 '24 03:06 katemamba