
Anthropic Component does not accept "system_message"

Open w3XeWIRpuND opened this issue 1 year ago • 1 comment

Bug Description

The Anthropic model component in our system is not accepting input from any source, including messages hard-coded directly into the unhidden field value. This severely limits the component's functionality.

When attempting to use the component with a system message, the following error is encountered:

ValueError: Error running graph:
  Error building Component Anthropic:
  Expected mapping type as input to ChatPromptTemplate. Received <class 'list'>.
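
For context, this error message originates in LangChain's prompt layer: a prompt template with more than one input variable raises it when invoked with a list instead of a mapping of variables. The following is a minimal sketch (assuming langchain_core is installed; the prompt and variable names are illustrative only, not Langflow's internals) that reproduces the same message:

from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_messages(
    [("system", "{system_message}"), ("human", "{input_value}")]
)

# Works: template variables supplied as a mapping.
prompt.invoke({"system_message": "You are terse.", "input_value": "Hello"})

# Raises "Expected mapping type as input to ChatPromptTemplate.
# Received <class 'list'>." - template variables supplied as a list.
prompt.invoke(["You are terse.", "Hello"])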

Reproduction

  1. Set up the components as shown in the provided screenshot.
  2. Add a system message to the input.
  3. Attempt to pass input (including the system message) to the Anthropic model component.
  4. Observe that the component fails to accept the input.

Expected behavior

The Anthropic model component should be able to accept input, including a system message, from various sources such as:

  • Other components in the system
  • Messages hard-coded in the unhidden field value
  • User input
  • Any other valid input source

Who can help?

No response

Operating System

Pop OS 24.04

Langflow Version

1.0.12

Python Version

None

Screenshot

(Screenshot attached to the original issue; not reproduced here.)

Flow File

No response

w3XeWIRpuND · Jul 26 '24 14:07

To resolve the issue where the Anthropic model component does not accept the system_message and raises a ValueError from ChatPromptTemplate, ensure that the input_value parameter is correctly passed to the component and used when the model is run, and that the system_message is handled correctly.

Here is the relevant code snippet with the necessary adjustments:

from langchain_anthropic.chat_models import ChatAnthropic
from pydantic.v1 import SecretStr

from langflow.base.constants import STREAM_INFO_TEXT
from langflow.base.models.model import LCModelComponent
from langflow.field_typing import LanguageModel
from langflow.io import BoolInput, DropdownInput, FloatInput, IntInput, MessageTextInput, Output, SecretStrInput

class AnthropicModelComponent(LCModelComponent):
    display_name = "Anthropic"
    description = "Generate text using Anthropic Chat&Completion LLMs with prefill support."
    icon = "Anthropic"

    inputs = [
        MessageTextInput(name="input_value", display_name="Input"),
        IntInput(
            name="max_tokens",
            display_name="Max Tokens",
            advanced=True,
            value=4096,
            info="The maximum number of tokens to generate. Set to 0 for unlimited tokens.",
        ),
        DropdownInput(
            name="model",
            display_name="Model Name",
            options=[
                "claude-3-5-sonnet-20240620",
                "claude-3-opus-20240229",
                "claude-3-sonnet-20240229",
                "claude-3-haiku-20240307",
            ],
            info="https://python.langchain.com/docs/integrations/chat/anthropic",
            value="claude-3-5-sonnet-20240620",
        ),
        SecretStrInput(
            name="anthropic_api_key",
            display_name="Anthropic API Key",
            info="Your Anthropic API key.",
        ),
        FloatInput(name="temperature", display_name="Temperature", value=0.1),
        MessageTextInput(
            name="anthropic_api_url",
            display_name="Anthropic API URL",
            advanced=True,
            info="Endpoint of the Anthropic API. Defaults to 'https://api.anthropic.com' if not specified.",
        ),
        BoolInput(name="stream", display_name="Stream", info=STREAM_INFO_TEXT, advanced=True, value=False),
        MessageTextInput(
            name="system_message",
            display_name="System Message",
            info="System message to pass to the model.",
            advanced=True,
        ),
        MessageTextInput(
            name="prefill",
            display_name="Prefill",
            info="Prefill text to guide the model's response.",
            advanced=True,
        ),
    ]
    outputs = [
        Output(display_name="Text", name="text_output", method="text_response"),
        Output(display_name="Language Model", name="model_output", method="build_model"),
    ]

    def build_model(self) -> LanguageModel:
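        # Note: the system_message and prefill inputs declared above are not
        # used here; build_model only constructs the ChatAnthropic instance.
        # In Langflow, the system message is typically applied later, when the
        # base LCModelComponent invokes the model (e.g. in text_response).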
        model = self.model
        anthropic_api_key = self.anthropic_api_key
        max_tokens = self.max_tokens
        temperature = self.temperature
        anthropic_api_url = self.anthropic_api_url or "https://api.anthropic.com"

        try:
            output = ChatAnthropic(
                model=model,
                anthropic_api_key=(SecretStr(anthropic_api_key) if anthropic_api_key else None),
                max_tokens_to_sample=max_tokens,  # type: ignore
                temperature=temperature,
                anthropic_api_url=anthropic_api_url,
                streaming=self.stream,
            )
        except Exception as e:
            raise ValueError("Could not connect to Anthropic API.") from e

        return output

    def _get_exception_message(self, exception: Exception) -> str | None:
        """
        Get a message from an Anthropic exception.

        Args:
            exception (Exception): The exception to get the message from.

        Returns:
            str | None: The message from the exception, or None if it cannot be extracted.
        """
        try:
            from anthropic import BadRequestError
        except ImportError:
            return None
        if isinstance(exception, BadRequestError):
            message = exception.body.get("error", {}).get("message")  # type: ignore
            if message:
                return message
        return None

Ensure that input_value and system_message are correctly passed to the component and consumed when the model is invoked; build_model itself only constructs the ChatAnthropic instance. If these parameters are not being handled properly, you may need to adjust the component accordingly. Additionally, make sure the anthropic_api_url parameter falls back to its default value when not provided [1][2][3][4].
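
For illustration, here is a hedged sketch (not Langflow's actual implementation; the helper name _example_invoke is hypothetical) of how the system_message and input_value inputs could be combined and sent to the model returned by build_model, assuming langchain_core message classes:

from langchain_core.messages import HumanMessage, SystemMessage

def _example_invoke(component):
    # Build the ChatAnthropic instance exactly as the component does.
    llm = component.build_model()
    messages = []
    # Prepend the system message only when one was provided.
    if getattr(component, "system_message", None):
        messages.append(SystemMessage(content=component.system_message))
    messages.append(HumanMessage(content=component.input_value))
    # A list of BaseMessage objects is a valid input to a chat model,
    # unlike a ChatPromptTemplate, which expects a mapping of variables.
    return llm.invoke(messages)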

To continue talking to Dosu, mention @dosu.

dosubot[bot] · Jul 26 '24 14:07

The issue disappeared after upgrading the Langflow containers to "latest".

w3XeWIRpuND · Jul 27 '24 02:07