Eric Deandrea

Results 283 comments of Eric Deandrea

@geoand I can even reproduce it in some of my other projects that use OpenAI, so it's unrelated to Mistral... It's also unrelated to Quarkus LangChain4j - using Quarkus LangChain4j 0.26.2...

@philippart-s if you downgrade the Quarkus version to `3.21.2` it should work.
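If the project follows the usual Quarkus layout, that downgrade is typically a one-property change in the pom, which the versions-maven-plugin can apply from the command line. A sketch, assuming the pom manages the Quarkus version through the common `quarkus.platform.version` property (check your pom - the property name may differ):

```shell
# Assumption: the pom pins Quarkus via the `quarkus.platform.version`
# property; adjust the -Dproperty value if your pom names it differently.
./mvnw versions:set-property \
    -Dproperty=quarkus.platform.version \
    -DnewVersion=3.21.2
```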

That's interesting, because it builds fine on my machine :)

Actually - update the version of `langchain4j-ovh-ai` to `1.0.0-beta3`

Actually, a minute or two after, I get this:

```
Error: Unsupported features in 2 methods
Detailed message:
Error: Discovered unresolved type during parsing: kotlinx.coroutines.CancellableContinuationImpl. This error is reported at...
```

Are you doing a `clean` build after changing versions? Weird that we're both getting errors, but different ones. In any event, something is broken.

> I assume you should create a CDI producer of Tokenizer that returns the same tokenizer that the chat model is using.. But we don't seem to have any examples...

I see there is an `OpenAiTokenizer` and a `HuggingFaceTokenizer`. Which one would I want if I was using Ollama with Llama 3.2?

I tried to add this:

```java
@Dependent
public class LangChain4jTokenizerConfig {

    @Produces
    @ApplicationScoped
    @UnlessBuildProfile("ollama")
    public Tokenizer openAITokenizer(
            @ConfigProperty(name = "quarkus.langchain4j.openai.chat-model.model-name") String modelName) {
        return new OpenAiTokenizer(modelName);
    }

    @Produces
    @ApplicationScoped
    @IfBuildProfile("ollama")
    public...
```
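For reference, a completed version of that producer pair could look like the sketch below. This is not a confirmed answer to the Ollama question - the `ollamaTokenizer` body is my assumption, and the exact packages for `Tokenizer` and `HuggingFaceTokenizer` vary across langchain4j versions, so verify them against the version in your project:

```java
// Sketch only - package names below match langchain4j 0.x / Quarkus 3.x as I
// understand them; check them against your dependency versions.
import dev.langchain4j.model.Tokenizer;
import dev.langchain4j.model.openai.OpenAiTokenizer;
import io.quarkus.arc.profile.IfBuildProfile;
import io.quarkus.arc.profile.UnlessBuildProfile;
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.enterprise.context.Dependent;
import jakarta.enterprise.inject.Produces;
import org.eclipse.microprofile.config.inject.ConfigProperty;

@Dependent
public class LangChain4jTokenizerConfig {

    // Default profiles: use the OpenAI tokenizer for the configured model name.
    @Produces
    @ApplicationScoped
    @UnlessBuildProfile("ollama")
    public Tokenizer openAITokenizer(
            @ConfigProperty(name = "quarkus.langchain4j.openai.chat-model.model-name") String modelName) {
        return new OpenAiTokenizer(modelName);
    }

    // "ollama" profile: assumption on my part - I haven't confirmed which
    // Tokenizer implementation matches Llama 3.2's tokenization.
    // HuggingFaceTokenizer is one candidate; its package (and whether a
    // no-arg constructor exists) depends on the langchain4j module/version.
    @Produces
    @ApplicationScoped
    @IfBuildProfile("ollama")
    public Tokenizer ollamaTokenizer() {
        return new HuggingFaceTokenizer();
    }
}
```

The `@UnlessBuildProfile`/`@IfBuildProfile` pair ensures exactly one `Tokenizer` bean is registered per build profile, so injection points see an unambiguous bean either way.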

I will investigate this a little later this morning.