Amith Koujalgi
Here's my current understanding of LLMs. I am still in the process of learning, so please forgive any inaccuracies and feel free to correct me if needed. > Embeddings are...
Thanks @oliverhummel for reporting the issue. It has been fixed now and the fix is available in version [1.0.82](https://github.com/ollama4j/ollama4j/releases/tag/1.0.82). For managing long conversations with the model, you can use the...
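For anyone who finds this later, here is a minimal sketch of carrying chat history across turns with ollama4j. The package names, the `llama3` model name, and the builder/`getChatHistory()` methods are taken from my recollection of the ollama4j docs and may differ between versions (older releases used a different group id and package layout), so please treat this as an illustrative sketch rather than the definitive API.

```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import io.github.ollama4j.models.chat.OllamaChatRequest;
import io.github.ollama4j.models.chat.OllamaChatRequestBuilder;
import io.github.ollama4j.models.chat.OllamaChatResult;

public class ChatHistoryExample {
    public static void main(String[] args) throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434/");

        // First turn of the conversation.
        OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance("llama3");
        OllamaChatRequest request = builder
                .withMessage(OllamaChatMessageRole.USER, "What is the capital of France?")
                .build();
        OllamaChatResult firstTurn = ollamaAPI.chat(request);

        // Feed the accumulated history back in for the next turn so the model
        // keeps the conversational context.
        OllamaChatRequest followUp = builder
                .withMessages(firstTurn.getChatHistory())
                .withMessage(OllamaChatMessageRole.USER, "And how large is its population?")
                .build();
        OllamaChatResult secondTurn = ollamaAPI.chat(followUp);

        // The returned history now contains both turns.
        System.out.println(secondTurn.getChatHistory());
    }
}
```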
@oliverhummel Thanks! Glad it works well for you. About the history, I do not have much experience with it, but what I have observed is that even ChatGPT or any...
@porchy13 Thanks for the update. The API has been updated to use bearer auth if it is set.
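A minimal sketch of what using that looks like on the client side; the `setBearerAuth` and `ping` method names are what I recall from the ollama4j API, so please check the javadocs for your version before relying on them.

```java
import io.github.ollama4j.OllamaAPI;

public class BearerAuthExample {
    public static void main(String[] args) {
        // Point the client at an Ollama instance that sits behind an auth proxy.
        OllamaAPI ollamaAPI = new OllamaAPI("https://my-ollama.example.com/");

        // If a bearer token is set, the client sends it as an
        // "Authorization: Bearer <token>" header on every request.
        ollamaAPI.setBearerAuth(System.getenv("OLLAMA_BEARER_TOKEN"));

        System.out.println("Ollama reachable: " + ollamaAPI.ping());
    }
}
```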
Thanks @samie, sounds like a great idea. But I am not very well versed in the Vaadin setup. I'll have to look around to see how to accomplish this.
Absolutely! I appreciate your willingness to help. It would definitely be great to make these updates. Looking forward to collaborating!
> Is this possible to disable stream via code? Not available at the moment. But why would one need to disable streaming mode? Isn't it a good experience to have...
@samie It seems like the issue is coming from the `OllamaChatResponseModel.getMessage()` method in the Ollama4j library, where the model didn't return a message. I think throwing an NPE wouldn't be the...
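To make that failure mode clearer on the caller side, something like the guard below could be used instead of letting a bare NPE propagate. This is a hypothetical helper for illustration, not the fix that landed in the library, and the import paths are assumptions based on the current ollama4j package layout.

```java
import io.github.ollama4j.models.chat.OllamaChatMessage;
import io.github.ollama4j.models.chat.OllamaChatResponseModel;

final class ChatResponseGuard {
    private ChatResponseGuard() {}

    // Fail with a descriptive error instead of letting callers hit a bare NPE
    // when the model returns a response without a message.
    static OllamaChatMessage requireMessage(OllamaChatResponseModel response) {
        if (response == null || response.getMessage() == null) {
            throw new IllegalStateException(
                    "Ollama returned a chat response without a message; check the model and request");
        }
        return response.getMessage();
    }
}
```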
Thanks for the information @liebki. Please feel free to correct it and create a PR.
I hope [this](https://github.com/ollama4j/ollama4j/commit/18760250eaf6f99b60f8d8e8a8b3336504db9bac) adds a distinction.