
Gemini model is reset on every load

Open • charlyhue opened this issue 10 months ago • 13 comments

Bug Description

Hi,

The Gemini model is reset to learnlm-1.5-pro-experimental every time I open the flow.

Reproduction

  • Add the Google Generative AI block
  • Select a model
  • Reload the page

Expected behavior

The model should not be reset

Who can help?

No response

Operating System

langflowai/langflow Docker image

Langflow Version

1.1.2

Python Version

None

Screenshot

(screenshot attached)

Flow File

No response

charlyhue · Jan 28 '25

Hello @charlyhue! I'm here to help you with any bugs, questions, or contributions. Let's tackle this Gemini model reset together!

The issue you're experiencing, where the Gemini model resets to "learnlm-1.5-pro-experimental" upon reloading the page, might be related to the default model setting or state management in the Langflow codebase.

The default model for the Google Generative AI block is set to "gemini-1.5-pro" in the DropdownInput for the model_name field [1]. However, there might be an event listener or hook that resets configurations upon page reload. Specifically, the useEffect hook in the Page component resets the flow when the reactFlowInstance and currentFlowId are available, which could be affecting your model selection [2].
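For context, here is a hedged sketch of how such a dropdown default is typically declared in a Langflow component; the import path and exact arguments are assumptions, not the source cited above:

from langflow.io import DropdownInput  # import path assumed, not verified against the cited source

model_name = DropdownInput(
    name="model_name",
    display_name="Model",
    options=[
        "gemini-1.5-pro",
        "gemini-1.5-flash",
        "learnlm-1.5-pro-experimental",
    ],
    value="gemini-1.5-pro",  # default; a refresh that rebuilds and re-sorts the options can override it
)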

To address this, you might need to check if there's a mechanism to persist user-selected configurations across sessions or if there's a way to modify the default settings to prevent the reset.


dosubot[bot] · Jan 28 '25

Hey @charlyhue

Please test it in version 1.1.3. I tried to reproduce it, and what you mentioned didn’t happen. Let me know if you need anything!

carlosrcoelho · Feb 02 '25

Hi, the issue is still present in 1.1.3 and 1.1.4 for Gemini and Claude. Here is the affected flow.

Article translation.json

charlyhue · Feb 05 '25

Hi,

I'm facing the same problem. We recently upgraded our Langflow instance from 1.1.1 to 1.1.3. After that, the component that allowed us to use the Gemini model started behaving weirdly, and we can no longer use the flow via API.

bmelo · Feb 05 '25

Same issue on 1.1.4.post1 and 1.1.3.dev1.

zhchin · Feb 09 '25

Facing a similar issue. It was working for some time; after I tried making another flow, the previously working one reset to learnlm-1.5-pro-experimental.

ankitpise · Feb 09 '25

Same issue on 1.1.4 for me

mobinzk · Feb 10 '25

Same on 1.1.4.

Naroh091 · Feb 10 '25

Same issue for agents using Groq.

brianmarkowitz · May 06 '25

up, same issue

exs-xgg · May 19 '25

I noticed that when opening the flow, the brief "Saving" indicator at the top beside the flow title changes the value of the Gemini model name. A script fires on flow load and forces the first model in the list onto the model_name value (screenshots attached); a sketch of the effect follows.
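The plain-Python sketch below (not Langflow code; the model_name/options/value keys follow the component snippet later in this thread, and the model list is illustrative) contrasts a refresh that forces the first option with one that preserves the saved selection:

def refresh_on_load(build_config: dict, fetched_models: list[str]) -> dict:
    """Buggy refresh: the dropdown value is forced to the first fetched option."""
    build_config["model_name"]["options"] = fetched_models
    build_config["model_name"]["value"] = fetched_models[0]  # clobbers the saved choice
    return build_config


def refresh_preserving_choice(build_config: dict, fetched_models: list[str]) -> dict:
    """Refresh that keeps the previously selected model when it is still available."""
    current = build_config["model_name"]["value"]
    build_config["model_name"]["options"] = fetched_models
    if current not in fetched_models:
        build_config["model_name"]["value"] = fetched_models[0]
    return build_config


if __name__ == "__main__":
    models = ["learnlm-1.5-pro-experimental", "gemini-1.5-pro", "gemini-1.5-flash"]

    saved = {"model_name": {"options": [], "value": "gemini-1.5-pro"}}
    print(refresh_on_load(saved, models)["model_name"]["value"])
    # -> learnlm-1.5-pro-experimental (the saved selection is lost on reload)

    saved = {"model_name": {"options": [], "value": "gemini-1.5-pro"}}
    print(refresh_preserving_choice(saved, models)["model_name"]["value"])
    # -> gemini-1.5-pro (the saved selection survives the reload)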

exs-xgg · May 20 '25

Found it: it's this one:

(screenshot attached)

exs-xgg · May 20 '25

I have the same problem with version 1.4.2.

AndrzejOlender · May 20 '25

I still have this issue on 1.4.2.

It appears to reset to the first model in the list of Gemini models; I have to keep setting it back to the one I want to use.

Cratis1831 · Jun 02 '25

Facing the same issue on 1.4.3.

ariannurrahman · Jun 24 '25

This is my proposed fix; it has been working fine for us since.

    def build_model(self) -> LanguageModel:  # type: ignore[type-var]
        try:
            from langchain_google_genai import ChatGoogleGenerativeAI
        except ImportError as e:
            msg = "The 'langchain_google_genai' package is required to use the Google Generative AI model."
            raise ImportError(msg) from e

        google_api_key = self.api_key
        model = self.model_name
        max_output_tokens = self.max_output_tokens
        temperature = self.temperature
        top_k = self.top_k
        top_p = self.top_p
        n = self.n

        return ChatGoogleGenerativeAI(
            model=model,
            max_output_tokens=max_output_tokens or None,
            temperature=temperature,
            top_k=top_k or None,
            top_p=top_p or None,
            n=n or 1,
            google_api_key=SecretStr(google_api_key).get_secret_value(),
        )

    def get_models(self, tool_model_enabled: bool | None = None) -> list[str]:
        try:
            import google.generativeai as genai

            genai.configure(api_key=self.api_key)
            model_ids = [
                model.name.replace("models/", "")
                for model in genai.list_models()
                if "generateContent" in model.supported_generation_methods
            ]
            model_ids.sort(reverse=True)  # reverse-alphabetical, so "learnlm-..." entries sort ahead of "gemini-..."
        except (ImportError, ValueError) as e:
            logger.exception(f"Error getting model names: {e}")
            model_ids = GOOGLE_GENERATIVE_AI_MODELS
        if tool_model_enabled:
            try:
                from langchain_google_genai.chat_models import ChatGoogleGenerativeAI
            except ImportError as e:
                msg = "langchain_google_genai is not installed."
                raise ImportError(msg) from e
            for model in list(model_ids):  # iterate over a copy so removal below is safe
                model_with_tool = ChatGoogleGenerativeAI(
                    model=model,
                    google_api_key=self.api_key,
                )
                if not self.supports_tool_calling(model_with_tool):
                    model_ids.remove(model)
        return model_ids

    def update_build_config(self, build_config: dotdict, field_value: Any, field_name: str | None = None):
        if field_name in {"base_url", "model_name", "tool_model_enabled", "api_key"} and field_value:
            try:
                if len(self.api_key) == 0:
                    ids = GOOGLE_GENERATIVE_AI_MODELS
                else:
                    try:
                        ids = self.get_models(tool_model_enabled=self.tool_model_enabled)
                    except (ImportError, ValueError, requests.exceptions.RequestException) as e:
                        logger.exception(f"Error getting model names: {e}")
                        ids = GOOGLE_GENERATIVE_AI_MODELS
                build_config["model_name"]["options"] = ids
                build_config["model_name"]["value"] = self.model_name
            except Exception as e:
                msg = f"Error getting model names: {e}"
                raise ValueError(msg) from e
        return build_config

I don't have time to push a PR because of my schedule, but I hope this helps.

exs-xgg · Jun 24 '25

https://github.com/langflow-ai/langflow/pull/9495

Empreiteiro · Aug 22 '25