
Issue with Vertexai API generation config

**Open** · Kcarreras opened this issue 9 months ago · 0 comments

I'm encountering the following error when using `generate_topic_lvl1`, passing `"vertex"` as the API and `"gemini-2.0-flash-001"` as the model:

File "/root/.local/lib/python3.10/site-packages/topicgpt_python/utils.py", line 234, in iterative_prompt response = self.model_obj.generate_content( File "/root/.local/lib/python3.10/site-packages/google/generativeai/generative_models.py", line 305, in generate_content request = self._prepare_request( File "/root/.local/lib/python3.10/site-packages/google/generativeai/generative_models.py", line 156, in _prepare_request generation_config = generation_types.to_generation_config_dict(generation_config) File "/root/.local/lib/python3.10/site-packages/google/generativeai/types/generation_types.py", line 224, in to_generation_config_dict raise TypeError( TypeError: Invalid input type. Expected a dict or GenerationConfig for generation_config. However, received an object of type: <class 'vertexai.generative_models._generative_models.GenerationConfig'>. Object Value: temperature: 0.0 top_p: 1.0 max_output_tokens: 1000

There seems to be some issue with this part of `utils.py` when a "gemini" model is specified:

"else: config = GenerationConfig( max_output_tokens=max_tokens, temperature=temperature, top_p=top_p, )"

Is a different config format expected for these models? It is not clear to me; hopefully someone can clarify. Thank you.
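
For context, the traceback indicates that the config object is built with `vertexai.generative_models.GenerationConfig`, while the model object being called is from `google.generativeai`, whose `generate_content` only accepts a dict or its own `GenerationConfig`. Below is a minimal sketch of one possible workaround, not the library's official fix; `model_obj` and `prompt` are placeholders for whatever topicGPT constructs internally:

```python
# Sketch of a possible workaround (assumption: the caller controls which
# config object is built). The google.generativeai SDK accepts either a
# plain dict or its own GenerationConfig; the vertexai class of the same
# name is rejected with the TypeError shown above.
from google.generativeai.types import GenerationConfig

config = GenerationConfig(
    max_output_tokens=1000,
    temperature=0.0,
    top_p=1.0,
)

# A plain dict is also accepted, which sidesteps the class mismatch entirely:
# config = {"max_output_tokens": 1000, "temperature": 0.0, "top_p": 1.0}

response = model_obj.generate_content(prompt, generation_config=config)
```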

— Kcarreras, Mar 21 '25