Tiago Pereira
Could we make use of `partial` on the `validation_environment` method to define the env keys for the `client`, and then not pass them any further down the pipeline? This would...
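A rough sketch of the idea outside of LangChain, just to illustrate it: bind the env-derived keys onto the client call with `functools.partial` so downstream code never has to pass them again. The names below (`create_completion`, `deployment_name`, `api_version`) are illustrative stand-ins, not the actual LangChain internals.

```python
from functools import partial


def create_completion(prompt: str, *, deployment_name: str, api_version: str) -> str:
    """Stand-in for the underlying client call that needs the env keys."""
    return f"[{deployment_name}/{api_version}] echo: {prompt}"


# The "validate_environment" step would read the keys once and bake them in.
client = partial(
    create_completion,
    deployment_name="my-deployment",  # would come from the environment
    api_version="2023-05-15",
)

# Further down the pipeline only the prompt is needed.
print(client("Hello"))
```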
Not sure if everyone noticed it but this should now be solved since #5792 🎉
I just ran into the same issue. As far as I can see, the `generate` method (and `agenerate`) of the `BaseChatModel` is not calling all of the different stages of...
I created a child class around the `ChatOpenAI` and `AzureChatOpenAI` classes which wraps those methods, adding the calls to the `callback_manager` (just as is done for `BaseLLM`). Best...
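Roughly what that child class looks like, as a sketch against the pre-#5792 LangChain release. The `callback_manager.on_llm_start` / `on_llm_end` / `on_llm_error` calls are my reading of how `BaseLLM.generate` fired its callbacks at the time, so treat the exact signatures as assumptions.

```python
from typing import List, Optional

from langchain.chat_models import ChatOpenAI
from langchain.schema import BaseMessage, LLMResult


class CallbackChatOpenAI(ChatOpenAI):
    """ChatOpenAI subclass that fires the LLM callbacks around generate()."""

    def generate(
        self, messages: List[List[BaseMessage]], stop: Optional[List[str]] = None
    ) -> LLMResult:
        # Flatten the chat messages into plain strings, since the LLM
        # callbacks of that version expected prompts as strings.
        prompts = ["\n".join(m.content for m in batch) for batch in messages]
        self.callback_manager.on_llm_start(
            {"name": self.__class__.__name__}, prompts, verbose=self.verbose
        )
        try:
            output = super().generate(messages, stop=stop)
        except Exception as e:
            self.callback_manager.on_llm_error(e, verbose=self.verbose)
            raise
        self.callback_manager.on_llm_end(output, verbose=self.verbose)
        return output
```

The same wrapping can be applied to `agenerate` and to `AzureChatOpenAI`.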
I think this has been fixed in the meantime. Callbacks are now implemented differently and `BaseChatModel` already calls them correctly ([current implementation here](https://github.com/hwchase17/langchain/blob/258c3198559da5844be3f78680f42b2930e5b64b/langchain/chat_models/base.py#L60)). I have since updated my customisation to the...
Hmm, not sure right now. Before, you'd have to attach the callback to the llm, but I see that in the current implementation it should be possible to do this...
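If I read the linked implementation correctly, both routes should now work: attach the handler to the llm when constructing it, or pass it only for a single call. The `callbacks=` parameter on `generate` is my reading of that commit, so treat it as an assumption.

```python
from langchain.callbacks import StdOutCallbackHandler
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage

handler = StdOutCallbackHandler()

# Option 1: attach the handler to the llm itself (the old-style approach).
chat = ChatOpenAI(callbacks=[handler])

# Option 2: pass the handler only for this call; the refactored base class
# forwards it to the callback manager it builds per run.
chat.generate([[HumanMessage(content="Hello")]], callbacks=[handler])
```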
Another issue is that after instantiating an `AzureOpenAI` llm, those variables get set in the `openai` module; if we then want to instantiate an `OpenAI` llm, it won't set...
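To illustrate the shared module-level state in the pre-1.0 `openai` package (the attribute names `api_type`, `api_base`, `api_version`, `api_key` are from openai 0.x; the endpoint and key values below are placeholders):

```python
import openai

# What the AzureOpenAI wrapper effectively does when it validates its env:
openai.api_type = "azure"
openai.api_base = "https://my-resource.openai.azure.com/"  # placeholder endpoint
openai.api_version = "2023-05-15"

# A later, plain OpenAI wrapper only sets the key, so the Azure values above
# stay in place and subsequent requests still target the Azure configuration.
openai.api_key = "sk-..."  # placeholder
print(openai.api_type)  # still "azure"
```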
Also, the `AzureOpenAI` model doesn't support these keys either; only the chat version (`AzureChatOpenAI`) does.