Gemini model is reset on every load
Bug Description
Hi,
The Gemini model is reset to learnlm-1.5-pro-experimental every time I open the flow.
Reproduction
- Add the Google Generative AI block
- Select a model
- Reload the page
Expected behavior
The model should not be reset
Who can help?
No response
Operating System
langflowai/langflow Docker image
Langflow Version
1.1.2
Python Version
None
Screenshot
Flow File
No response
Hello @charlyhue! I'm here to help you with any bugs, questions, or contributions. Let's tackle this Gemini model reset together!
The issue you're experiencing, where the Gemini model resets to "learnlm-1.5-pro-experimental" upon reloading the page, might be related to the default model setting or state management in the Langflow codebase.
The default model for the Google Generative AI block is set to "gemini-1.5-pro" in the DropdownInput for the model_name field [1]. However, there might be an event listener or hook that resets configurations upon page reload. Specifically, the useEffect hook in the Page component resets the flow when the reactFlowInstance and currentFlowId are available, which could be affecting your model selection [2].
To address this, you might need to check if there's a mechanism to persist user-selected configurations across sessions or if there's a way to modify the default settings to prevent the reset.
Hey @charlyhue
Please test it in version 1.1.3. I tried to reproduce it, and what you mentioned didn’t happen. Let me know if you need anything!
Hi, The issue is still there in 1.1.3 and 1.1.4 for Gemini and Claude. Here is the impacted flow.
Hi,
I'm facing the same problem. We recently upgraded our Langflow instance from 1.1.1 to 1.1.3. After that, the component that allowed us to use the Gemini model started behaving weirdly, and we can no longer use the flow via API.
Same issue on 1.1.4.post1 and 1.1.3.dev1.
Facing a similar issue. It was working for some time; after I tried making another flow, the previously working one reset to learnlm-1.5-pro-experimental.
Same issue on 1.1.4 for me
Same on 1.1.4.
Same issue for agents using Groq.
up, same issue
I noticed that when opening the flow, the brief "Saving" indicator next to the flow title appears and the value of the Gemini model name changes. This script fires on flow load and forces the first model in the list onto the model_name value.
Found it: it's this one:
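To make the reported behavior concrete, here is a minimal sketch of the suspected reset pattern (this is a hypothetical simplification, not the actual Langflow source; `refresh_model_options` and the plain-dict `build_config` are stand-ins):

```python
# Hypothetical sketch of the suspected reset: when the model options are
# refreshed on flow load, the value is overwritten with the first entry in
# the list, discarding the user's saved selection.

def refresh_model_options(build_config: dict, options: list[str]) -> dict:
    build_config["model_name"]["options"] = options
    build_config["model_name"]["value"] = options[0]  # saved choice is lost here
    return build_config

config = {"model_name": {"options": [], "value": "gemini-1.5-pro"}}
models = ["learnlm-1.5-pro-experimental", "gemini-1.5-pro"]
refresh_model_options(config, models)
print(config["model_name"]["value"])  # → learnlm-1.5-pro-experimental
```

The saved "gemini-1.5-pro" is clobbered by whatever happens to sort first in the refreshed list, which matches the symptom everyone is seeing.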
I have the same problem with version 1.4.2.
I also have this issue still on 1.4.2
It appears to default to the first model in the list of Gemini models; I have to keep setting it back to the one I want to use.
Facing the same issue on 1.4.3
This is my proposed fix; it has been working fine for us since:
```python
from typing import Any

import requests
from pydantic import SecretStr

# LanguageModel, dotdict, logger, and GOOGLE_GENERATIVE_AI_MODELS come from the
# surrounding Langflow component module; the functions below are methods of the
# Google Generative AI component class.


def build_model(self) -> LanguageModel:  # type: ignore[type-var]
    try:
        from langchain_google_genai import ChatGoogleGenerativeAI
    except ImportError as e:
        msg = "The 'langchain_google_genai' package is required to use the Google Generative AI model."
        raise ImportError(msg) from e

    return ChatGoogleGenerativeAI(
        model=self.model_name,
        max_output_tokens=self.max_output_tokens or None,
        temperature=self.temperature,
        top_k=self.top_k or None,
        top_p=self.top_p or None,
        n=self.n or 1,
        google_api_key=SecretStr(self.api_key).get_secret_value(),
    )


def get_models(self, tool_model_enabled: bool | None = None) -> list[str]:
    try:
        import google.generativeai as genai

        genai.configure(api_key=self.api_key)
        model_ids = [
            model.name.replace("models/", "")
            for model in genai.list_models()
            if "generateContent" in model.supported_generation_methods
        ]
        model_ids.sort(reverse=True)
    except (ImportError, ValueError) as e:
        logger.exception(f"Error getting model names: {e}")
        model_ids = GOOGLE_GENERATIVE_AI_MODELS
    if tool_model_enabled:
        try:
            from langchain_google_genai.chat_models import ChatGoogleGenerativeAI
        except ImportError as e:
            msg = "langchain_google_genai is not installed."
            raise ImportError(msg) from e
        # Iterate over a copy: removing entries from the list being iterated
        # would skip models.
        for model in list(model_ids):
            # Check the candidate model itself, not the currently selected one.
            model_with_tool = ChatGoogleGenerativeAI(
                model=model,
                google_api_key=self.api_key,
            )
            if not self.supports_tool_calling(model_with_tool):
                model_ids.remove(model)
    return model_ids


def update_build_config(self, build_config: dotdict, field_value: Any, field_name: str | None = None):
    if field_name in {"base_url", "model_name", "tool_model_enabled", "api_key"} and field_value:
        try:
            if len(self.api_key) == 0:
                ids = GOOGLE_GENERATIVE_AI_MODELS
            else:
                try:
                    ids = self.get_models(tool_model_enabled=self.tool_model_enabled)
                except (ImportError, ValueError, requests.exceptions.RequestException) as e:
                    logger.exception(f"Error getting model names: {e}")
                    ids = GOOGLE_GENERATIVE_AI_MODELS
            build_config["model_name"]["options"] = ids
            # Key change: keep the user's saved selection instead of letting the
            # refreshed options list reset the value to its first entry.
            build_config["model_name"]["value"] = self.model_name
        except Exception as e:
            msg = f"Error getting model names: {e}"
            raise ValueError(msg) from e
    return build_config
```
I don't have time to push a PR because of my schedule, but I hope this helps.
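For anyone who wants to sanity-check the "keep the saved model" behavior outside Langflow, here is a standalone sketch (the `FakeComponent` class, `FALLBACK_MODELS` list, and plain-dict `build_config` are hypothetical stand-ins for the component class, the constants module, and dotdict):

```python
# Standalone sanity check: after an options refresh, the saved selection
# should survive instead of being reset to the first model in the list.

FALLBACK_MODELS = ["learnlm-1.5-pro-experimental", "gemini-1.5-pro", "gemini-1.5-flash"]


class FakeComponent:
    def __init__(self, model_name: str):
        self.model_name = model_name  # the user's saved selection

    def update_build_config(self, build_config: dict, field_value, field_name=None) -> dict:
        if field_name == "model_name" and field_value:
            build_config["model_name"]["options"] = FALLBACK_MODELS
            # Preserve the saved selection instead of resetting to options[0].
            build_config["model_name"]["value"] = self.model_name
        return build_config


component = FakeComponent(model_name="gemini-1.5-pro")
config = {"model_name": {"options": [], "value": ""}}
config = component.update_build_config(config, field_value="gemini-1.5-pro", field_name="model_name")
print(config["model_name"]["value"])  # → gemini-1.5-pro
```

With the original behavior the printed value would have been "learnlm-1.5-pro-experimental" (the first entry of the refreshed list), which is exactly the reset reported in this issue.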
https://github.com/langflow-ai/langflow/pull/9495