Nuno Campos
This is now available as the `chunk` arg to on_llm_new_token
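To illustrate the shape of that callback, here is a minimal sketch of a handler that receives both the plain token and the new `chunk` argument. The handler class and the dict stand-in for the chunk object are illustrative assumptions, not the actual LangChain types:

```python
# Hedged sketch: a streaming callback that reads the new `chunk` kwarg
# passed alongside each token. `StreamingHandler` and the dict chunk
# are illustrative, not the real LangChain API surface.

class StreamingHandler:
    def __init__(self):
        self.tokens = []
        self.chunks = []

    def on_llm_new_token(self, token, *, chunk=None, **kwargs):
        # `token` is the raw text; `chunk` (when provided) carries the
        # richer generation object, e.g. content plus metadata.
        self.tokens.append(token)
        if chunk is not None:
            self.chunks.append(chunk)

handler = StreamingHandler()
handler.on_llm_new_token("Hel", chunk={"text": "Hel"})
handler.on_llm_new_token("lo", chunk={"text": "lo"})
print("".join(handler.tokens))  # -> Hello
```

Accepting `**kwargs` keeps the handler forward-compatible if more keyword arguments are added to the callback later.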
This is great!
@vowelparrot I only caught this because I saw a similar mistake in JS while doing this https://github.com/hwchase17/langchainjs/pull/843
If it's anything like the JS package, I think this issue will be present in more chains
Those are message prompt templates; this is about messages
Hi @glejdis, which LLM are you using?
Making the async methods live in the same class makes reusing helper methods across both implementations easier
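A minimal sketch of that layout, with hypothetical names (`Runner`, `_prepare`) standing in for the real classes: both the sync and async entry points live on one class and call the same private helper, rather than duplicating it across two parallel implementations:

```python
import asyncio

class Runner:
    # Shared helper used by both the sync and async entry points.
    def _prepare(self, text):
        return text.strip().lower()

    def run(self, text):
        return self._prepare(text)

    async def arun(self, text):
        # The async variant reuses the same helper instead of
        # re-implementing it in a separate async-only class.
        return self._prepare(text)

r = Runner()
print(r.run("  Hello "))               # -> hello
print(asyncio.run(r.arun(" World ")))  # -> world
```

Keeping both variants on one class also means a fix to the helper applies to both code paths at once.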
@hwchase17 This is a first step towards being able to, e.g., run multiple independent agents in the same Python process, logging to different places.