Mark Pollack

Results 455 comments of Mark Pollack

Thanks. Will investigate. We do have a streaming test with Ollama in `OllamaChatModelFunctionCallingIT`, but it isn't going through ChatClient; it uses the OllamaChatModel directly. What model are you using...

We are no longer supporting the Bedrock API to access chat models; we have switched to using the Bedrock Converse APIs.

I agree this is a gap. The issue in the example linked to by @FakeTrader is that the underlying chat models in this use case need to point to different OpenAI...

As an example of what I'm thinking:

```yaml
spring:
  ai:
    openai:
      models:
        enabled: true
      instances:
        gpt4:
          apiKey: "your-api-key-for-gpt4"
          baseUrl: "https://api.openai.com"
          organizationId: "your-org-id"
          chatProperties:
            options:
              model: "gpt-4"
              temperature: 0.7
        llama:
          apiKey:...
```
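To make the shape of the idea concrete, here is a minimal, self-contained sketch of how per-instance settings like those above could be held in a named registry. This is purely illustrative plain Java — `ClientConfig` and `ChatClientRegistry` are hypothetical names, not Spring AI types:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical holder for one named instance's settings, mirroring the
// proposed spring.ai.openai.instances.* structure. Not a Spring AI class.
record ClientConfig(String apiKey, String baseUrl, String model, double temperature) {}

// Hypothetical registry keyed by instance name (e.g. "gpt4", "llama").
class ChatClientRegistry {
    private final Map<String, ClientConfig> instances = new HashMap<>();

    public void register(String name, ClientConfig config) {
        instances.put(name, config);
    }

    public ClientConfig get(String name) {
        ClientConfig config = instances.get(name);
        if (config == null) {
            throw new IllegalArgumentException("No instance named " + name);
        }
        return config;
    }
}
```

In a real implementation the registry would presumably be populated by configuration-property binding and hand out fully built chat model beans rather than raw settings.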

I'm trying to collect all the different issues around this. It is true that one can do

```java
var openAiApi = OpenAiApi.builder()
        .apiKey(System.getenv("OPENAI_API_KEY"))
        .build();
var openAiChatOptions = OpenAiChatOptions.builder()
        .model("gpt-3.5-turbo")
        .temperature(0.4)...
```

I've made a WIP PR for people to review: https://github.com/spring-projects/spring-ai/pull/3037. The flow in the tests is

```java
@SpringBootTest(classes = MultiOpenAiClientIT.Config.class)
@EnabledIfEnvironmentVariable(named = "GROQ_API_KEY", matches = ".+")
@EnabledIfEnvironmentVariable(named = "OPENAI_API_KEY", matches...
```

Also note that DeepSeek now has its own model implementation, as it is starting to differ significantly from OpenAI in terms of options.

See #3037. Closing this issue for now. We should revisit a more comprehensive declarative solution post-GA in another issue.

@apappascs it seems like a community project would be the most effective route to get active incubation and feedback.

I'm going to be reviewing the classes for ChatResponse and metadata. I think it was a mistake to inherit from HashMap; instead, a HashMap should be used as a field...
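For readers unfamiliar with the composition-over-inheritance point being made: extending HashMap exposes the whole Map API as the type's public surface, while holding the map as a field keeps the API narrow. A minimal sketch with a hypothetical class shape (not the actual Spring AI types):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical metadata class: the HashMap is an implementation detail
// held as a field, so only the methods we choose are public.
class ResponseMetadata {
    private final Map<String, Object> entries = new HashMap<>();

    public void put(String key, Object value) {
        entries.put(key, value);
    }

    public Object get(String key) {
        return entries.get(key);
    }

    public boolean containsKey(String key) {
        return entries.containsKey(key);
    }
}
```

With inheritance, callers could invoke `clear()`, `remove()`, and every other Map method, and the class could never migrate away from HashMap without breaking its contract; composition avoids both problems.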