Nuno Campos
Hi, not sure what the question is here. If you call your graph with `astream_events`, we try to stream output from everything called inside it. Were you expecting to...
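A minimal stdlib sketch of the idea described above: events produced by anything called inside the graph are surfaced to the top-level caller as a single event stream. The names (`graph`, `inner`, the event dicts) are hypothetical stand-ins, not the real LangChain API.

```python
import asyncio

async def inner():
    # A nested step; its events should reach the top-level consumer.
    yield {"event": "inner_chunk", "data": 1}

async def graph():
    # The "graph" re-emits events from everything it calls internally.
    yield {"event": "graph_start"}
    async for ev in inner():
        yield ev
    yield {"event": "graph_end"}

async def main():
    return [ev["event"] async for ev in graph()]

events = asyncio.run(main())
print(events)  # → ['graph_start', 'inner_chunk', 'graph_end']
```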
I'm not opposed to this as long as we can do it inside the existing add_node method
You mean run them one at a time, or all of them a bit like a test?
Yeah, then they should be tests instead, inside the langchain folder. I had thought the examples were meant to serve the same purpose as the Python notebooks.
I tried that, but it didn't fix it
Done in #124
@vladgolubev Thanks for this. I think part of the reason this wasn't done before is due to openai rate limits. Did you try passing a higher value to the existing...
@vladgolubev we've increased the default value of batchSize, do you want to give it a go on the latest version? OpenAI recommends that batches are sent in the same API...
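For illustration, the batching idea behind `batchSize` can be sketched in a few lines: split the inputs into consecutive chunks of at most `batch_size` and send each chunk as one API request. The helper name and the chunk size here are hypothetical; the actual default in the library may differ.

```python
def batched(items, batch_size):
    """Split items into consecutive batches of at most batch_size."""
    for i in range(0, len(items), batch_size):
        yield items[i:i + batch_size]

# Each batch would become a single embeddings API call.
print(list(batched(list(range(5)), 2)))  # → [[0, 1], [2, 3], [4]]
```

A larger `batchSize` means fewer requests, which helps with per-request rate limits at the cost of bigger payloads.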
Obviously this is using the newly released version 0.26.0
The final issue is in this line: https://github.com/openai/openai-python/blob/4fee0da142f58307edb30111ed484d2d41e4811d/openai/api_requestor.py#L297. We exit the `with` statement before the response stream is consumed by the caller; therefore, unless we're using a global ClientSession, the...
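The pitfall above can be reproduced with the stdlib alone: if a function returns a lazy generator from inside a `with` block, the block exits (closing the underlying resource) before the caller ever iterates. Here `io.StringIO` is a stand-in for the HTTP response; the real aiohttp case is analogous but async.

```python
import io

def broken_stream():
    # The `with` block exits (closing the "connection") as soon as the
    # generator expression is returned, before anything is consumed.
    buf = io.StringIO("chunk1\nchunk2\n")
    with buf:
        return (line for line in buf)

def working_stream():
    # `yield from` keeps the `with` block open until the caller has
    # exhausted the generator, so the resource outlives consumption.
    buf = io.StringIO("chunk1\nchunk2\n")
    with buf:
        yield from buf

gen = broken_stream()
try:
    next(gen)
except ValueError:
    print("stream was already closed")  # I/O operation on closed file

print(list(working_stream()))  # → ['chunk1\n', 'chunk2\n']
```

A session shared at a wider scope (the "global ClientSession" mentioned above) sidesteps the problem for the same reason: the resource is not tied to the lifetime of the `with` block that created the stream.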