java-ai-playground

I want to run this locally with my LLM, such as Ollama.

Open · nevenc opened this issue Mar 14, 2024 · 4 comments

I want to be able to run this application locally with my LLM, such as Ollama.

The application currently requires openai.api.key in LangChain4jConfig. There is no way to skip the langchain4j-openai bean creation or to provide an alternative langchain4j-ollama configuration.

In Spring AI this is easy: you just use a different starter and the app works without code changes, e.g. spring-ai-ollama-spring-boot-starter instead of spring-ai-openai-spring-boot-starter.
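
For reference, a rough sketch of why the Spring AI version doesn't need code changes. The class below is hypothetical, not from this repo, and assumes the Spring AI 0.8.x ChatClient abstraction: the code depends only on ChatClient, so the bean is supplied by whichever starter is on the classpath.

import org.springframework.ai.chat.ChatClient;
import org.springframework.stereotype.Service;

// Hypothetical service: it only knows about the ChatClient abstraction, so the
// same code runs against OpenAI or Ollama depending on which starter is on the
// classpath and its spring.ai.* properties.
@Service
public class ChatService {

    private final ChatClient chatClient;

    public ChatService(ChatClient chatClient) {
        this.chatClient = chatClient;
    }

    public String chat(String userMessage) {
        return chatClient.call(userMessage);
    }
}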

nevenc avatar Mar 14 '24 10:03 nevenc

Fair point. The intent of the project was to show how the different libraries compare, so I haven't focused much on making it easy to change models. Like you said, you'd need to make code changes in the LangChain4j example if you want to use Ollama.

I need to look into it. Do you have ideas on how we could make the LangChain4j code support easier swapping of LLMs?

marcushellberg avatar Mar 14 '24 16:03 marcushellberg

I think the easiest way to do it would be swapping Spring profiles (e.g. "open-ai", "ollama"). The default could be "open-ai".

ygoron360 avatar Mar 15 '24 10:03 ygoron360

First of all, Marcus, thanks for putting together this demo application. I loved it! It makes an exciting demo of these technologies!

I agree with @ygoron360: profiles are probably the best approach. The easiest would be something like this:

@Profile("openai")
@Configuration
public class OpenAiConfiguration {
    ...
}

@Profile("ollama")
@Configuration
public class OllamaConfiguration {
    ...
}
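
To make this concrete, here is a minimal sketch of how those two configurations could look, assuming the langchain4j-open-ai and langchain4j-ollama modules and their ChatLanguageModel builders; the model names and base URL are illustrative, not taken from this repo:

import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.ollama.OllamaChatModel;
import dev.langchain4j.model.openai.OpenAiChatModel;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;

// Each class would live in its own file; they are shown together here for brevity.

@Profile("openai")
@Configuration
public class OpenAiConfiguration {

    // The openai.api.key property is only required when the "openai" profile is active.
    @Bean
    ChatLanguageModel chatLanguageModel(@Value("${openai.api.key}") String apiKey) {
        return OpenAiChatModel.builder()
                .apiKey(apiKey)
                .modelName("gpt-3.5-turbo")
                .build();
    }
}

@Profile("ollama")
@Configuration
public class OllamaConfiguration {

    // Points at a locally running Ollama server instead of the OpenAI API.
    @Bean
    ChatLanguageModel chatLanguageModel() {
        return OllamaChatModel.builder()
                .baseUrl("http://localhost:11434")
                .modelName("llama2")
                .build();
    }
}

The app could then default to the openai profile and be started with something like --spring.profiles.active=ollama to run entirely locally.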

nevenc avatar Mar 15 '24 13:03 nevenc

@nevenc Just FYI, we recently added a Quarkus version on the quarkus branch, and it lets you do exactly what you're looking for.

The README describes how to do that.

edeandrea avatar Nov 07 '24 20:11 edeandrea