
Gemini model integration issue with Agentica

Open Funncy opened this issue 11 months ago • 4 comments

I created a simple project using Agentica and it worked as expected. However, after switching the model to Gemini 2.0 Flash, the following error occurred when running a basic command.

Code:

import { Agentica, IAgenticaVendor } from "@agentica/core";
import OpenAI from "openai";
import typia from "typia";
import { BashController } from "./tools/bash";
import { FileReadController } from "./tools/file_read";

async function main() {
  const agent = new Agentica({
    model: "gemini",
    vendor: {
      model: "gemini-2.0-flash",
      api: new OpenAI({
        apiKey: "",
        baseURL: "https://generativelanguage.googleapis.com/v1beta/openai/",
      }),
    } satisfies IAgenticaVendor,
    controllers: [
      {
        protocol: "class",
        name: "bash",
        application: typia.llm.application<BashController, "gemini">(),
        execute: new BashController(),
      },
      {
        protocol: "class",
        name: "file_read",
        application: typia.llm.application<FileReadController, "gemini">(),
        execute: new FileReadController(),
      },
    ],
  });

  const result = await agent.conversate("안녕"); // "Hello" in Korean

  console.log(result);
}

main();

Result:

/Users/teddy/Development/flutter/projects/heroines/heroines/agent/node_modules/typia/src/internal/_assertGuard.ts:10
    else throw new TypeGuardError(props);
               ^
TypeGuardError: Error on json.assertParse(): invalid type on $input.id, expect to be string
    at Object._assertGuard (/Users/teddy/Development/flutter/projects/heroines/heroines/agent/node_modules/typia/src/internal/_assertGuard.ts:10:16)
    at _ao0 (/Users/teddy/Development/flutter/projects/heroines/heroines/agent/node_modules/@agentica/core/src/chatgpt/ChatGptCompletionMessageUtil.ts:18:5)
    at /Users/teddy/Development/flutter/projects/heroines/heroines/agent/node_modules/@agentica/core/lib/chatgpt/ChatGptCompletionMessageUtil.js:408:39
    at __assert (/Users/teddy/Development/flutter/projects/heroines/heroines/agent/node_modules/@agentica/core/lib/chatgpt/ChatGptCompletionMessageUtil.js:413:35)
    at /Users/teddy/Development/flutter/projects/heroines/heroines/agent/node_modules/@agentica/core/lib/chatgpt/ChatGptCompletionMessageUtil.js:416:44
    at Object.ChatGptCompletionMessageUtil.transformCompletionChunk (/Users/teddy/Development/flutter/projects/heroines/heroines/agent/node_modules/@agentica/core/lib/chatgpt/ChatGptCompletionMessageUtil.js:416:91)
    at /Users/teddy/Development/flutter/projects/heroines/heroines/agent/node_modules/@agentica/core/src/Agentica.ts:258:42
    at /Users/teddy/Development/flutter/projects/heroines/heroines/agent/node_modules/@agentica/core/src/internal/StreamUtil.ts:57:30
    at Generator.next (<anonymous>)
    at fulfilled (/Users/teddy/Development/flutter/projects/heroines/heroines/agent/node_modules/@agentica/core/lib/internal/StreamUtil.js:5:58) {
  method: 'json.assertParse',
  path: '$input.id',
  expected: 'string',
  value: undefined
}

Funncy avatar Mar 19 '25 07:03 Funncy

Google Gemini does not emit the "id" property, even though a ChatCompletionChunk is required to contain it as a string.

I encountered this issue and received the following response:

{"choices":[{"delta":{"content":"I","role":"assistant"},"index":0}],"created":1742371939,"model":"gemini-2.0-flash","object":"chat.completion.chunk","usage":{"completion_tokens":0,"prompt_tokens":113,"total_tokens":113}}

https://github.com/openai/openai-node/blob/master/src/resources/chat/completions/completions.ts#L336-L387
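The chunk above shows the problem: every top-level field is present except "id", which typia's strict validation then rejects. A minimal sketch of a client-side workaround is to normalize each streamed chunk before validation, filling in the fields the vendor omits. The `ChunkLike` interface and `normalizeChunk` helper below are illustrative assumptions, not part of the Agentica or OpenAI SDK APIs:

```typescript
// Minimal subset of the OpenAI SDK's ChatCompletionChunk shape, defined
// inline for illustration.
interface ChunkLike {
  id?: string;
  object?: string;
  created?: number;
  model?: string;
  choices?: unknown[];
}

// Hypothetical workaround: inject fallback values for fields Gemini omits,
// so the chunk passes a strict "id must be a string" check. The fallback
// id scheme (derived from "created") is an arbitrary choice for the sketch.
function normalizeChunk(chunk: ChunkLike): Required<ChunkLike> {
  return {
    id: chunk.id ?? `gemini-${chunk.created ?? 0}`,
    object: chunk.object ?? "chat.completion.chunk",
    created: chunk.created ?? Math.floor(Date.now() / 1000),
    model: chunk.model ?? "unknown",
    choices: chunk.choices ?? [],
  };
}
```

Applied to the chunk shown above, this would yield an object with `id: "gemini-1742371939"` and leave the delta content untouched. As noted below, though, patching "id" alone is not sufficient.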

sunrabbit123 avatar Mar 19 '25 08:03 sunrabbit123

@samchon

This seems to be an error caused by the Google model itself not conforming to the specification it claims to follow. So, similar to configuring tsconfig.json in Nestia, why not add an option that lets validation errors pass through and only logs them?

{
    ...
    "plugins": [
      { "transform": "typescript-transform-paths" },
      { "transform": "typia/lib/transform" },
      { 
        "transform": "@nestia/core/lib/transform",
        "validate": "validate",
        "stringify": "validate.log", // here
      },
    ]
    ...
}

kakasoo avatar Mar 19 '25 08:03 kakasoo

Even if you remove that validation check through monkey-patching, issues will still arise in the tool-calling path.

Based on my experiment, Gemini 2.0 Flash also does not provide an index in the tool_calls entries. This deviates from the interface the OpenAI SDK expects.

https://x.com/SunrabbitO84776/status/1902287632308015332
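The missing tool_calls index could in principle be normalized the same way as the missing "id": since streamed tool-call deltas arrive in order, sequential positions can stand in for the absent field. The `ToolCallLike` shape and `withIndices` helper below are a hypothetical sketch, not Agentica or OpenAI SDK code:

```typescript
// Minimal subset of the OpenAI SDK's streamed tool-call delta shape,
// defined inline for illustration.
interface ToolCallLike {
  index?: number;
  id?: string;
  function?: { name?: string; arguments?: string };
}

// Hypothetical sketch: assign sequential indices to tool calls when the
// vendor omits the "index" field, keeping any index that is present.
function withIndices(
  toolCalls: ToolCallLike[],
): Array<ToolCallLike & { index: number }> {
  return toolCalls.map((call, i) => ({ ...call, index: call.index ?? i }));
}
```

The caveat is that positional numbering is only safe if each chunk carries complete, ordered tool calls; if the vendor ever interleaves partial deltas across chunks, a position-based index could misattribute arguments, which is why fixing this upstream is preferable.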

sunrabbit123 avatar Mar 19 '25 09:03 sunrabbit123

https://github.com/wrtnlabs/agentica/blob/168de1e9481a7982ccaba3ed98b4f8c7308307c8/packages/core/src/chatgpt/ChatGptCompletionMessageUtil.ts#L18

Gemini is not keeping to the spec...

samchon avatar Mar 19 '25 11:03 samchon