
Gemini 2.0 API Key Configuration Unclear

Open 6p5ra opened this issue 10 months ago • 7 comments

Unable to determine the correct location and variable name for inserting the Gemini 2.0 API key in the project configuration.

To Reproduce

1. Attempt to integrate the Gemini 2.0 API
2. Search through the configuration files
3. Unable to identify the correct key insertion point, leading to configuration uncertainty

Additional Details

Specific questions:

1. Where should the Gemini 2.0 API key be placed?
2. What is the exact variable name for the API key?

6p5ra avatar Jan 23 '25 04:01 6p5ra

Hi there, you can set GEMINI_API_KEY in backend/.env

Then, in generate_code.py, set variant_models = [Llm.GPT_4O_2024_11_20, Llm.GEMINI_2_0_FLASH_EXP] (or whatever LLM combo you want to have). Make sure to set it right before this line: for index, model in enumerate(variant_models):

Let me know if this works.
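
To make the two changes concrete, here is a minimal, self-contained sketch. The Llm enum below is a stand-in with illustrative string values (the real enum lives in backend/llm.py), and in the real app the key is loaded from backend/.env rather than read directly from the environment:

```python
import os
from enum import Enum


# Stand-in for the repo's Llm enum; member names come from this thread,
# but the string values here are illustrative guesses.
class Llm(Enum):
    GPT_4O_2024_11_20 = "gpt-4o-2024-11-20"
    GEMINI_2_0_FLASH_EXP = "gemini-2.0-flash-exp"


# In the real app this comes from backend/.env, e.g. a line like:
# GEMINI_API_KEY=your-key-here
gemini_api_key = os.environ.get("GEMINI_API_KEY")

# The override described above: set variant_models just before the loop
# in generate_code.py that dispatches one generation per model.
variant_models = [Llm.GPT_4O_2024_11_20, Llm.GEMINI_2_0_FLASH_EXP]
for index, model in enumerate(variant_models):
    print(index, model.value)
```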

abi avatar Jan 24 '25 02:01 abi

I replaced it with the following, and now it works:

variant_models = []

# For creation, use Claude Sonnet 3.6, but it can be lazy,
# so for updates we use Claude Sonnet 3.5
if generation_type == "create":
    claude_model = Llm.CLAUDE_3_5_SONNET_2024_10_22
else:
    claude_model = Llm.CLAUDE_3_5_SONNET_2024_06_20

if openai_api_key and anthropic_api_key and GEMINI_API_KEY:
    variant_models = [
        claude_model,
        Llm.GPT_4O_2024_11_20,
        Llm.GEMINI_2_0_FLASH_EXP,
    ]
elif openai_api_key and GEMINI_API_KEY:
    variant_models = [
        Llm.GPT_4O_2024_11_20,
        Llm.GEMINI_2_0_FLASH_EXP,
    ]
elif anthropic_api_key and GEMINI_API_KEY:
    variant_models = [
        claude_model,
        Llm.GEMINI_2_0_FLASH_EXP,
    ]
elif openai_api_key and anthropic_api_key:
    variant_models = [
        claude_model,
        Llm.GPT_4O_2024_11_20,
    ]
elif openai_api_key:
    variant_models = [
        Llm.GPT_4O_2024_11_20,
        Llm.GPT_4O_2024_11_20,
    ]
elif anthropic_api_key:
    variant_models = [
        claude_model,
        Llm.CLAUDE_3_5_SONNET_2024_06_20,
    ]
elif GEMINI_API_KEY:
    variant_models = [
        Llm.GEMINI_2_0_FLASH_EXP,
        Llm.GEMINI_2_0_FLASH_EXP,
    ]
else:
    await throw_error(
        "No OpenAI, Anthropic, or Gemini API key found. Please add the environment variable OPENAI_API_KEY, ANTHROPIC_API_KEY, or GEMINI_API_KEY to backend/.env or in the settings dialog. If you add it to .env, make sure to restart the backend server."
    )
    raise Exception("No API keys found")
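
The selection ladder above could also be written more compactly. This is a hypothetical refactor (not from the repo) that keeps roughly the same priority order — Claude first, then GPT-4o, then Gemini — and pads to at least two variants by repeating the last available model; model names are plain strings here for illustration:

```python
# Hypothetical compact refactor of the key-based selection ladder.
def pick_variant_models(anthropic_key, openai_key, gemini_key, claude_model):
    models = []
    if anthropic_key:
        models.append(claude_model)
    if openai_key:
        models.append("gpt-4o-2024-11-20")
    if gemini_key:
        models.append("gemini-2.0-flash-exp")
    if not models:
        raise Exception("No API keys found")
    # Pad to two variants by repeating the last model. (The original
    # instead pairs Sonnet 3.6 with Sonnet 3.5 when only an Anthropic
    # key is present.)
    while len(models) < 2:
        models.append(models[-1])
    return models
```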

6p5ra avatar Jan 24 '25 19:01 6p5ra

It works, except for generating from video. I don't know why, since Flash 2.0 supports video inputs.

6p5ra avatar Jan 24 '25 19:01 6p5ra

Yup, video support is very alpha and currently only works with Claude Opus. It hasn't been updated in a while.

abi avatar Jan 25 '25 17:01 abi

Hi, @6p5ra and @abi

When I use the GEMINI model, the initial generation works, but I encounter the error below when I request an edit (a follow-up generation). I would appreciate your support with this issue.

Excellent work. Thanks.

ERROR:

Error generating code. Please contact support.
Traceback (most recent call last):
 File "/home/user/stc/backend/backend/llm.py", line 269, in stream_gemini_response
   if content_part["type"] == "image_url":  # type: ignore
      ~~~~~~~~~~~~^^^^^^^^
TypeError: string indices must be integers, not 'str'
Traceback (most recent call last):
 File "/home/user/stc/backend/backend/llm.py", line 269, in stream_gemini_response
   if content_part["type"] == "image_url":  # type: ignore
      ~~~~~~~~~~~~^^^^^^^^
TypeError: string indices must be integers, not 'str'
ERROR:    Exception in ASGI application
Traceback (most recent call last):
 File "/home/user/miniforge3/envs/dev-env/lib/python3.12/site-packages/uvicorn/protocols/websockets/websockets_impl.py", line 250, in run_asgi
   result = await self.app(self.scope, self.asgi_receive, self.asgi_send)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 File "/home/user/miniforge3/envs/dev-env/lib/python3.12/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
   return await self.app(scope, receive, send)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 File "/home/user/miniforge3/envs/dev-env/lib/python3.12/site-packages/fastapi/applications.py", line 1054, in __call__
   await super().__call__(scope, receive, send)
 File "/home/user/miniforge3/envs/dev-env/lib/python3.12/site-packages/starlette/applications.py", line 113, in __call__
   await self.middleware_stack(scope, receive, send)
 File "/home/user/miniforge3/envs/dev-env/lib/python3.12/site-packages/starlette/middleware/errors.py", line 152, in __call__
   await self.app(scope, receive, send)
 File "/home/user/miniforge3/envs/dev-env/lib/python3.12/site-packages/starlette/middleware/cors.py", line 77, in __call__
   await self.app(scope, receive, send)
 File "/home/user/miniforge3/envs/dev-env/lib/python3.12/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
   await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
 File "/home/user/miniforge3/envs/dev-env/lib/python3.12/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
   raise exc
 File "/home/user/miniforge3/envs/dev-env/lib/python3.12/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
   await app(scope, receive, sender)
 File "/home/user/miniforge3/envs/dev-env/lib/python3.12/site-packages/starlette/routing.py", line 715, in __call__
   await self.middleware_stack(scope, receive, send)
 File "/home/user/miniforge3/envs/dev-env/lib/python3.12/site-packages/starlette/routing.py", line 735, in app
   await route.handle(scope, receive, send)
 File "/home/user/miniforge3/envs/dev-env/lib/python3.12/site-packages/starlette/routing.py", line 362, in handle
   await self.app(scope, receive, send)
 File "/home/user/miniforge3/envs/dev-env/lib/python3.12/site-packages/starlette/routing.py", line 95, in app
   await wrap_app_handling_exceptions(app, session)(scope, receive, send)
 File "/home/user/miniforge3/envs/dev-env/lib/python3.12/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
   raise exc
 File "/home/user/miniforge3/envs/dev-env/lib/python3.12/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
   await app(scope, receive, sender)
 File "/home/user/miniforge3/envs/dev-env/lib/python3.12/site-packages/starlette/routing.py", line 93, in app
   await func(session)
 File "/home/user/miniforge3/envs/dev-env/lib/python3.12/site-packages/fastapi/routing.py", line 383, in app
   await dependant.call(**solved_result.values)
 File "/home/user/stc/backend/routes/generate_code.py", line 372, in stream_code
   raise Exception("All generations failed")
Exception: All generations failed
INFO:     connection closed


https://github.com/user-attachments/assets/d721cc4e-3927-4f5e-a7a2-9a15f39724e9

therkut avatar Feb 08 '25 10:02 therkut

Looks like follow-up prompts are failing but first prompts are working? Ah, I think currently the repo only supports Gemini for the initial generation. The code needs to be modified to do a better translation of the messages from the OpenAI format to the Gemini format for follow-ups to work. So yeah, the Gemini implementation is a little hacky at the moment.
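
As a sketch of the likely failure mode: in the OpenAI message format, content may be either a plain string or a list of typed parts. If a follow-up message arrives as a plain string, iterating over it yields single characters, and content_part["type"] raises exactly the TypeError shown in the traceback above. A defensive normalization (hypothetical, not the repo's actual fix) could look like this:

```python
def normalize_content(content):
    # OpenAI-style "content" is either a plain string or a list of part
    # dicts like {"type": "text", ...} / {"type": "image_url", ...}.
    # Wrapping the string form makes content_part["type"] always safe.
    if isinstance(content, str):
        return [{"type": "text", "text": content}]
    return content


def iter_image_urls(message):
    # Safe version of the failing loop: yields image URLs from a message
    # regardless of which content shape it uses.
    for content_part in normalize_content(message["content"]):
        if content_part["type"] == "image_url":
            yield content_part["image_url"]["url"]
```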

abi avatar Feb 09 '25 15:02 abi

@abi Yes, the initial requests are working, but the follow-up (correction) requests show an error on the screen. I'm looking forward to the fix.

Thanks

therkut avatar Feb 09 '25 16:02 therkut