`PromptRunner` does not support `AzureChatOpenAI` as an LLM input
Awesome project! Was playing around with the tool and hit a wall, since I only have access to Azure's OpenAI...
Describe the bug
`PromptRunner` does not support `AzureChatOpenAI` as an LLM input.
Browser and Version
- Chrome
- Version 124.0.6367.78 (Official Build) (64-bit)
To Reproduce
Steps to reproduce the behavior:
- Go to 'New Project'
- Under LLMs, drag out a new `AzureChatOpenAI` block
- Fill in the data required for the LLM
- Under Chains, drag out a new `PromptRunner` chain
- Try to connect the `BaseLanguageModel` output of the `AzureChatOpenAI` block to the `LLM` input on the `PromptRunner` block and struggle (see the sketch below for what this connection amounts to in plain LangChain)
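For reference, here is roughly what that connection corresponds to in plain LangChain (a sketch only, assuming the `langchain-openai` and `langchain-core` packages; the endpoint, deployment, and key values are placeholders, not real config):

```python
# Sketch only: the visual connection expressed in plain LangChain.
# Endpoint, deployment, and key values are placeholders.
from langchain_core.prompts import PromptTemplate
from langchain_openai import AzureChatOpenAI

llm = AzureChatOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com/",
    azure_deployment="<your-deployment>",
    api_version="2023-12-01-preview",
    api_key="<your-api-key>",
    temperature=0.7,
)

# AzureChatOpenAI subclasses BaseLanguageModel, so it should plug into
# anything that accepts a generic LLM input.
prompt = PromptTemplate.from_template("Answer in one sentence: {question}")
chain = prompt | llm
print(chain.invoke({"question": "What does a PromptRunner do?"}).content)
```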
Expected Behavior
I should be able to connect `AzureChatOpenAI`'s output to the `LLM` input on any block.
To address the issue of connecting the `BaseLanguageModel` output of the `AzureChatOpenAI` block to the `LLM` input on the `PromptRunner` block, follow these steps:
- Check Configuration Parameters: Ensure the configuration for `AzureChatOpenAIComponent` is correct, including `model`, `azure_endpoint`, `azure_deployment`, `api_version`, `api_key`, `temperature`, and `max_tokens`. The `api_version` should match the supported versions, like "2023-12-01-preview" (see the sketch below).
- Error Handling: If there's a `ValueError`, it might be due to incorrect parameters or network issues when connecting to the Azure OpenAI API. This indicates a problem in the instantiation process of `AzureChatOpenAI`.
- Compatibility Check: Make sure the `PromptRunner` block is compatible with the latest `AzureChatOpenAI` class. The `AzureChatOpenAI` class is designed to work with `PromptRunner` blocks that accept `BaseLanguageModel` objects, but compatibility issues can arise with updates.
- Update Dependencies: Ensure all dependencies, including `langchain`, `langflow`, and the Azure SDKs, are current. Outdated dependencies can lead to compatibility problems.
- Browser Compatibility: Test the setup in another browser or incognito mode to rule out browser-specific issues.
- Review Documentation: The documentation for `AzureChatOpenAIComponent` might offer additional troubleshooting steps or insights.
If these steps don't resolve the issue, providing the exact error message or behavior observed when attempting the connection could help with further troubleshooting.
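As a rough illustration of steps 1-3 above (a sketch, not Langflow's own code), assuming the `langchain-openai` package, with all credential values as placeholders:

```python
# Sketch only: validates the configuration from step 1, surfaces the
# ValueError from step 2, and runs the BaseLanguageModel check from step 3.
# All values below are placeholders.
from langchain_core.language_models import BaseLanguageModel
from langchain_openai import AzureChatOpenAI

config = {
    "model": "gpt-35-turbo",
    "azure_endpoint": "https://<your-resource>.openai.azure.com/",
    "azure_deployment": "<your-deployment>",
    "api_version": "2023-12-01-preview",
    "api_key": "<your-api-key>",
    "temperature": 0.2,
    "max_tokens": 256,
}

try:
    llm = AzureChatOpenAI(**config)
except ValueError as exc:
    # Typically raised for missing or invalid parameters (step 2).
    raise SystemExit(f"Could not instantiate AzureChatOpenAI: {exc}")

# PromptRunner's LLM input expects a BaseLanguageModel; AzureChatOpenAI
# should satisfy that interface (step 3).
assert isinstance(llm, BaseLanguageModel), "LLM is not a BaseLanguageModel"
print("Configuration accepted; object is a valid BaseLanguageModel input.")
```

If your environment runs an older `langchain` release, `AzureChatOpenAI` may instead be imported from `langchain.chat_models`; adjust the import accordingly.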
Sources
- src/backend/base/langflow/components/model_specs/AzureChatOpenAISpecs.py
- src/backend/base/langflow/components/models/AzureOpenAIModel.py
- src/backend/langflow/components/llms/AzureChatOpenAI.py
- src/backend/langflow/components/chains/PromptRunner.py
About Dosu
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.
To continue the conversation, mention @dosu.
Hello @tonydoesathing, Sorry for the delay. Did you try using the new version? Does the error still persist?
Hi @tonydoesathing
We hope you're doing well. Just a friendly reminder that if we do not hear back from you within the next 3 days, we will close this issue. If you need more time or further assistance, please let us know.
Thank you for your understanding!
Sorry, can I get an extension? It's on my todo list for tomorrow! I need to get everything set up again to test
Hi @tonydoesathing,
No worries at all, we totally understand! Take your time to get everything set up and test. We'll keep the issue open and look forward to your update.
Thanks for letting us know!
Thanks! You're the best! ^-^
Running the latest, looks like `AzureOpenAI` now has an LLM output exposed which connects to things!