Make API keys discoverable from environment variables
The backend should check for the necessary environment variables and set them on the nodes that require them.
A better approach would probably be to simply acknowledge that the keys are present and use them when the chat request arrives.
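The behavior described above could be sketched roughly like this: before a node is built, empty key fields are filled from the environment. This is a minimal illustration, not langflow's actual implementation; `ENV_FALLBACKS` and `fill_missing_keys` are hypothetical names.

```python
import os

# Hypothetical mapping from node field names to the environment
# variables that can supply them (names are illustrative).
ENV_FALLBACKS = {
    "openai_api_key": "OPENAI_API_KEY",
    "huggingfacehub_api_token": "HUGGINGFACEHUB_API_TOKEN",
}

def fill_missing_keys(node_params: dict) -> dict:
    """Return a copy of node_params where empty key fields are
    filled from the environment, if the variable is set.
    Values provided explicitly by the user are left untouched."""
    filled = dict(node_params)
    for field, env_var in ENV_FALLBACKS.items():
        if not filled.get(field) and os.getenv(env_var):
            filled[field] = os.environ[env_var]
    return filled
```

With this approach the UI can leave the key field blank and the value is resolved server-side when the request arrives, so the key never needs to be pasted into the flow.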
@ogabrielluiz is this something one can do today? I tried setting OPENAI_API_KEY as an environment variable, with no success.
I think I can work on that right away. It used to be possible, but we made some changes recently that might have broken it.
Has this affected the solution you showed for Azure OpenAI here: #85 ? Trying to make that work and not clear on how I should be doing it.
Thank you @ogabrielluiz, let me know when it's live and how I should set everything up, so the user does not need to set up the OpenAI keys.
@ogabrielluiz Is this something I can help with and contribute to, e.g. by loading from a .env file and optionally checking in the UI components whether the variables are already defined?
Hey, @MarkusSagen. This could be a good approach. LangChain already checks for them in the background if left empty but currently, we process it in a way that does not allow LangChain to do that.
What would be your plan? Load the .env in main, then have the frontend get the value from the environment and put it in the node by default?
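The plan discussed here could look something like the sketch below: a tiny stdlib-only .env loader called at startup (a real implementation would more likely use python-dotenv's `load_dotenv`; `load_env_file` is a hypothetical name used for illustration).

```python
import os
from pathlib import Path

def load_env_file(path: str = ".env") -> None:
    """Minimal .env loader: reads KEY=VALUE lines, ignores blank
    lines and '#' comments. Variables already present in the
    environment are NOT overwritten."""
    env_path = Path(path)
    if not env_path.exists():
        return
    for line in env_path.read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        os.environ.setdefault(key.strip(), value.strip().strip('"').strip("'"))
```

Once the variables are in `os.environ`, the backend can expose them (or just the fact that they exist) to the frontend so the key fields can be pre-filled or skipped.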
I think I got it working in the branch https://github.com/logspace-ai/langflow/tree/20-make-api-keys-discoverable-from-environment-variables
Thanks for the fix @MarkusSagen and @ogabrielluiz
what's the necessary format for the .env?
The .env isn't loaded by LangFlow, you'd have to set the variable in your environment.
We can create an issue to implement that at some point.
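Since LangFlow reads keys from the process environment rather than a .env file, a quick pre-launch check can confirm the variables are actually set. This is a small illustrative sketch; `REQUIRED_VARS` and `check_env` are hypothetical names.

```python
import os

# Extend this list for other providers you rely on.
REQUIRED_VARS = ["OPENAI_API_KEY"]

def check_env() -> list:
    """Return the names of required variables missing from the
    current process environment."""
    return [name for name in REQUIRED_VARS if not os.getenv(name)]
```

If `check_env()` returns a non-empty list, export the variables in the shell you launch LangFlow from (e.g. `export OPENAI_API_KEY=...` on Linux/macOS) before starting it.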