
How to use the Azure LLMs support?

Open lascqy opened this issue 1 year ago • 4 comments

As the title says, I've seen there is some basic support for Azure LLMs and those commits have been merged to the dev branch... However, I can't find the Azure nodes when I clone the latest version locally and run it...

lascqy avatar Jun 10 '23 07:06 lascqy

Hello! As the AzureLLM node is not fully developed, that commit only added initial support. It does not appear in the Nodes list because the code is not complete.

lucaseduoli avatar Jun 16 '23 11:06 lucaseduoli

see #85

mycaule avatar Jun 16 '23 13:06 mycaule

@ogabrielluiz mentioned here a way to get this working. However, I get the error ERROR - Error: Resource not found for all of my attempts (see below). I didn't understand the suggestions in #85. I'd be grateful for more guidance 🙏

Attempt 1

Azure OpenAI Endpoint (this is the original, provided in the sample code of the Azure OpenAI Playground): https://mysubdomain.openai.azure.com/openai/deployments/gpt-35-turbo/chat/completions?api-version=2023-03-15-preview

Langflow log output shows: https://mysubdomain.openai.azure.com/openai/deployments/gpt-35-turbo/chat/completions?api-version=2023-03-15-preview/chat/completions

Attempt 2

Azure OpenAI Endpoint (after I trimmed it): https://mysubdomain.openai.azure.com/openai/deployments/gpt-35-turbo

Langflow log output shows: https://mysubdomain.openai.azure.com/openai/deployments/gpt-35-turbo/chat/completions
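The doubled path in the first log line suggests the client appends the deployment path and `/chat/completions` itself, so the endpoint field should hold only the bare resource URL. A rough sketch of how the URL gets assembled (an illustrative reconstruction, not the openai SDK's actual code):

```python
# api_base must be only the resource root: no path, no query string.
# Everything after it is appended by the client, which is why pasting the
# full Playground URL produces ".../chat/completions?...​/chat/completions".
api_base = "https://mysubdomain.openai.azure.com"
deployment_id = "gpt-35-turbo"
api_version = "2023-03-15-preview"

url = (
    f"{api_base}/openai/deployments/{deployment_id}"
    f"/chat/completions?api-version={api_version}"
)
print(url)
```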

profplum700 avatar Jul 06 '23 16:07 profplum700

Yes, please at least update the README. Azure OpenAI may be the biggest managed LLM platform; I can't understand why supporting it is not a priority.

mycaule avatar Jul 06 '23 17:07 mycaule

Hi, I am unsure how to read the information here. I have gone through the code and it seems that Azure-deployed LLMs are already supported. However, there is no example or documentation showing how this would work. Any pointers?

trummelbummel avatar Jul 26 '23 12:07 trummelbummel

My setup for Azure access is as follows: set the environment variables OPENAI_API_KEY, OPENAI_API_BASE, OPENAI_API_TYPE, and OPENAI_API_VERSION.

Then launch langflow. In the UI, when you create an OpenAI or ChatOpenAI node, click the settings and enable the model_kwargs input. In model_kwargs you set {"deployment_id": }.
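In script form, the environment setup above looks roughly like this (all values are placeholders; the subdomain and API version are taken from earlier in the thread, so substitute your own):

```python
import os

# Placeholder values for an Azure OpenAI resource; langflow and the
# underlying openai client read these at startup, so set them in the
# same shell/process before launching langflow.
os.environ["OPENAI_API_TYPE"] = "azure"
os.environ["OPENAI_API_KEY"] = "<your-azure-openai-key>"
os.environ["OPENAI_API_BASE"] = "https://mysubdomain.openai.azure.com"
os.environ["OPENAI_API_VERSION"] = "2023-03-15-preview"
```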

This is pretty hacky and a pain to do every time, so I think AzureOpenAI and AzureChatOpenAI components should be added.

Just sharing my pattern for others trying to get unblocked.

I'll be looking into creating a custom component for AzureOpenAI for my teams in the next week or two.

boarder7395 avatar Aug 08 '23 20:08 boarder7395

I can't tell you how grateful I am for you sharing this 😂 I'll try this out later.

I tried it and it works. To give an example for duffers like me, I set model_kwargs to the following (with the value of my deployment ID):

{"deployment_id": "gpt-35-turbo"}
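Conceptually, the wrapper merges these extra model_kwargs keys into the parameters of every request it sends, which is how deployment_id reaches the Azure endpoint. A toy illustration (not langchain's actual code):

```python
# Parameters the wrapper would send anyway...
base_params = {"model": "gpt-35-turbo", "temperature": 0.7}
# ...plus whatever you typed into the model_kwargs field in the UI.
model_kwargs = {"deployment_id": "gpt-35-turbo"}

# Extra keys are merged into the request, so the Azure API sees deployment_id.
request_params = {**base_params, **model_kwargs}
print(request_params)
```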


profplum700 avatar Aug 09 '23 06:08 profplum700

I have switched to using the new custom components to do this. Such an amazing feature!

boarder7395 avatar Aug 12 '23 11:08 boarder7395

> I have switched to using the new custom components to do this. Such an amazing feature!

@boarder7395 cool! do you want to share with us an example how to use it?

robertZaufall avatar Aug 12 '23 14:08 robertZaufall

@robertZaufall In the following I used the llms folder so that the custom component appears under the LLMs tab in the UI. The top-level keys in https://github.com/logspace-ai/langflow/blob/dev/src/backend/langflow/config.yaml define which directory names are scanned when loading custom components. You can also use the custom_components/ directory instead if you want all your custom components to appear under the Custom Components tab.

Define your custom component as follows.

components/llms/azure_openai.py

from langchain.llms.base import BaseLLM
from langchain.llms.openai import OpenAI

from langflow import CustomComponent


class AzureOpenAI(CustomComponent):
    display_name: str = "AzureOpenAI"
    description: str = "AzureOpenAI LLM"

    def build_config(self):
        return {
            "model": {
                "multiline": False,
                "required": True,
                "options": ["text-davinci-003"],
                "value": "text-davinci-003",
            },
            "deployment_name": {
                "multiline": False,
                "required": True,
                "options": ["text-davinci-003"],
                "value": "text-davinci-003",
            },
        }

    def build(self, model: str, deployment_name: str) -> BaseLLM:
        # deployment_id is not a declared field on OpenAI; langchain moves
        # unrecognized kwargs into model_kwargs, so it reaches the Azure API.
        return OpenAI(model=model, deployment_id=deployment_name)

components/llms/__init__.py

from azure_openai import AzureOpenAI

__all__ = ["AzureOpenAI"]

Dockerfile (sample)

FROM python:3.10-slim

RUN apt-get update && apt-get install gcc g++ git make default-libmysqlclient-dev build-essential pkg-config -y
RUN useradd -m -u 1000 user
USER user
ENV HOME=/home/user \
    PATH=/home/user/.local/bin:$PATH

WORKDIR $HOME/app

RUN pip install langflow==0.4.5 boto3 mysqlclient -U --user

COPY --chown=user components $HOME/app/components
COPY --chown=user config.yaml $HOME/app/

CMD ["langflow", "--host", "0.0.0.0", "--port", "7860", "--components-path", "/home/user/app/components"]

Here's a link to the documentation: https://docs.langflow.org/components/custom. Most of this is documented there; some of it I found out by digging into the code base.

boarder7395 avatar Aug 14 '23 12:08 boarder7395

> My setup for azure access is as follows: Set environment variables OPENAI_API_KEY OPENAI_API_BASE OPENAI_API_TYPE OPENAI_API_VERSION
>
> Then launch langflow. In the UI when you create openai or chatopenai you have to click the settings, then enable the model_kwargs input. In model_kwargs you set {"deployment_id": }.
>
> This is pretty hacky and a pain to do everytime so I think the azureopenai component, and azurechatopenai components should be added.
>
> Just sharing my pattern for others trying to get unblocked.
>
> I'll be looking into create a custom component for azureopenai for my teams in the next week or two.

On the UI page, where do I set OPENAI_API_VERSION?

chensi2017 avatar Oct 12 '23 19:10 chensi2017

In the UI you can use model_kwargs with an openai_api_version key. Or you can set it as an environment variable.
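For example, a combined model_kwargs value for the UI field might look like this (deployment name and API version are placeholders from earlier in the thread):

```python
import json

# Both keys get merged into each request by the wrapper: deployment_id picks
# the Azure deployment, openai_api_version picks the API version.
model_kwargs = {
    "deployment_id": "gpt-35-turbo",
    "openai_api_version": "2023-03-15-preview",
}
# The UI field takes this as JSON:
print(json.dumps(model_kwargs))
```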

boarder7395 avatar Oct 12 '23 22:10 boarder7395