
[Bug]: Gemini fails in Vertex AI Workbench

Open · LeoMai2024 opened this issue 3 months ago · 1 comment

Describe the bug

Hi AutoLabel team, amazing project! I am trying to use autolabel in a Vertex AI Workbench notebook. I installed the autolabel library successfully and I have Gemini Pro access, but the notebook raises an error when I try to use Gemini. The only difference between my setup and the documentation is that I don't have an API key in the Workbench, since I use the Gemini access granted to my project. Could you help me understand how I can use it?

The Gemini section of my config:

```python
"model": {
    "provider": "google",
    "name": "gemini-pro",  # I also tried "gemini-1.0-pro"; both raise the same error as below
    "params": {}
},
```
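For context, Gemini is reachable from a Workbench notebook without an API key through the project's Application Default Credentials. The snippet below is a minimal sketch of that path using the Vertex AI SDK directly; the project ID and region are placeholders, not values from this issue:

```python
# Sketch (not from the issue): check that project-scoped access via
# Application Default Credentials reaches Gemini from the Workbench,
# with no API key involved. Project ID and region are placeholders.
import vertexai
from vertexai.preview.generative_models import GenerativeModel

vertexai.init(project="your-gcp-project-id", location="us-central1")
model = GenerativeModel("gemini-pro")
print(model.generate_content("Hello from Workbench").text)
```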

Error Screenshots

```
ValidationError                           Traceback (most recent call last)
Cell In[18], line 2
      1 # create an agent for labeling
----> 2 agent = LabelingAgent(config=config)

File /opt/conda/lib/python3.10/site-packages/autolabel/labeler.py:96, in LabelingAgent.__init__(self, config, cache, example_selector, create_task, console_output, generation_cache, transform_cache)
     92 self.config = (
     93     config if isinstance(config, AutolabelConfig) else AutolabelConfig(config)
     94 )
     95 self.task = TaskFactory.from_config(self.config)
---> 96 self.llm: BaseModel = ModelFactory.from_config(
     97     self.config, cache=self.generation_cache
     98 )
     99 score_type = "logprob_average"
    100 if self.config.task_type() == TaskType.ATTRIBUTE_EXTRACTION:

File /opt/conda/lib/python3.10/site-packages/autolabel/models/__init__.py:47, in ModelFactory.from_config(config, cache)
     45 try:
     46     model_cls = MODEL_REGISTRY[provider]
---> 47     model_obj = model_cls(config=config, cache=cache)
     48     # The below ensures that users should based off of the BaseModel
     49     # when creating/registering custom models.
     50     assert isinstance(
     51         model_obj, BaseModel
     52     ), f"{model_obj} should inherit from autolabel.models.BaseModel"

File /opt/conda/lib/python3.10/site-packages/autolabel/models/palm.py:68, in PaLMLLM.__init__(self, config, cache)
     66     self.llm = ChatVertexAI(model_name=self.model_name, **self.model_params)
     67 else:
---> 68     self.llm = VertexAI(model_name=self.model_name, **self.model_params)

File /opt/conda/lib/python3.10/site-packages/langchain/load/serializable.py:74, in Serializable.__init__(self, **kwargs)
     73 def __init__(self, **kwargs: Any) -> None:
---> 74     super().__init__(**kwargs)
     75     self._lc_kwargs = kwargs

File /opt/conda/lib/python3.10/site-packages/pydantic/main.py:341, in pydantic.main.BaseModel.__init__()

ValidationError: 1 validation error for VertexAI
__root__
  Unknown model publishers/google/models/gemini-1.0-pro; {'gs://google-cloud-aiplatform/schema/predict/instance/text_generation_1.0.0.yaml': <class 'vertexai.preview.language_models._PreviewTextGenerationModel'>} (type=value_error)
```
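The traceback bottoms out inside LangChain's VertexAI wrapper rather than in autolabel itself, so the error can likely be reproduced without autolabel. A minimal sketch, assuming the same langchain version that the environment above uses (newer releases expose this class from langchain_community.llms or langchain_google_vertexai instead):

```python
# Sketch: construct LangChain's VertexAI wrapper directly, mirroring the call
# autolabel's palm.py makes at line 68. If this raises the same
# "Unknown model ... gemini-1.0-pro" ValidationError, the installed
# langchain / Vertex AI SDK versions are the likely culprit rather than
# missing credentials or a missing API key.
from langchain.llms import VertexAI

llm = VertexAI(model_name="gemini-pro")
```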


LeoMai2024 · Mar 31, 2024, 20:03

cc: @Vaibhav2001

rishabh-bhargava · Mar 31, 2024, 23:03