Andrei Bondarev

180 comments by Andrei Bondarev

@scarrick68 It doesn't support Ollama just yet, but have you looked at the [Langchain::Assistant](https://github.com/patterns-ai-core/langchainrb?tab=readme-ov-file#assistants)?

@scarrick68 It doesn't support Ollama because Ollama doesn't officially support tools. There are hacky ways around it, but I thought we'd wait until Ollama builds that capability themselves first.

@scarrick68 There's a way to make tool calling work by putting the instructions and tool declarations in the prompt itself, in an "XML-like" format. Have you looked into it?
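Something roughly like this is what I have in mind (just a sketch: the tool name, parameters, and tags below are made up purely for illustration, and it assumes a local Ollama server):

```ruby
require "langchain"

llm = Langchain::LLM::Ollama.new(url: "http://localhost:11434")

# Declare the tool inline in the system prompt and ask the model to answer
# with an XML-like invocation that we parse ourselves.
system_prompt = <<~PROMPT
  You have access to the following tool:
  <tool>
    <name>get_weather</name>
    <parameters><city>string</city></parameters>
  </tool>
  When you need it, reply ONLY with:
  <tool_call><name>get_weather</name><city>CITY</city></tool_call>
PROMPT

response = llm.chat(messages: [
  { role: "system", content: system_prompt },
  { role: "user", content: "What's the weather in Denver?" }
])

# Pull the XML-like tool call out of the plain-text completion.
if (match = response.chat_completion&.match(%r{<tool_call>.*?<city>(.+?)</city>.*?</tool_call>}m))
  puts "Model wants weather for: #{match[1]}"
end
```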

@scarrick68 Check this out: https://x.com/ollama/status/1793392887612260370?s=46&t=ZgKczGvaONuo_4dgGcUd_Q We should definitely implement it!

@scarrick68 The Assistant supports Ollama now. I still think we should have a better way of persisting messages. I'm open to ideas!
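For reference, a minimal sketch of what works now (the URL, instructions, and printed output are just placeholders):

```ruby
require "langchain"

llm = Langchain::LLM::Ollama.new(url: "http://localhost:11434")

assistant = Langchain::Assistant.new(
  llm: llm,
  instructions: "You are a helpful assistant"
)

assistant.add_message(content: "Hello!")
assistant.run

# `assistant.messages` holds the conversation in memory; serializing these
# (and re-hydrating them later) is the persistence question I'd like to solve.
assistant.messages.each { |m| puts "#{m.role}: #{m.content}" }
```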

@kokuyouwind Thank you for this proposal! I'm curious: do you have a need for this in your applications? Would this make passing the system role easier for you?

@kokuyouwind Thank you for your PR, and for using this library in your gem 😄 I'd like to actually think through this after the Langchain::Assistant Anthropic support is added here: https://github.com/patterns-ai-core/langchainrb/issues/543....

@kokuyouwind I've been thinking that the `#chat(messages: [])` method could accept the `Langchain::Messages::*` instances directly. For example:

```ruby
message_1 = Langchain::Messages::AnthropicMessage.new(role: "user", content: "hi!")
message_2 = Langchain::Messages::AnthropicMessage.new(role: "assistant", content: "Hey! How can I help?")
```
...
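The consuming side would then presumably just be `llm.chat(messages: [message_1, message_2])`, with no hash conversion in between.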

@palladius I think we can close this one, right?

@bricesanchez Thank you for the PR; it looks like the specs are failing.