
[Bug]: `ValidationError` occurs when running branch `gemini`

Open OTR opened this issue 6 months ago • 4 comments

Describe the bug

First Case Scenario

Given

  • Environment: my local Windows 10 machine
  • Python version: Python 3.11.7 (tags/v3.11.7:fa7a6f2, Dec 4 2023, 19:24:49) [MSC v.1937 64 bit (AMD64)] on win32
  • autogen: version 0.2.2
  • branch: gemini

When

I tried to run the first function from the code samples listed below on my local Windows machine.

Then

I got a StopCandidateException (I rarely get this exception; it is an intermittent bug):

raise generation_types.StopCandidateException(response.candidates[0])
google.generativeai.types.generation_types.StopCandidateException: index: 0
finish_reason: RECITATION

see full traceback listing below

Then, when I tried to reproduce the exception above, I got a different exception (see Step 5 in the steps to reproduce).

Second Case Scenario

I tried to reproduce the bug above on the GitHub Codespaces platform.

Given:

Environment: running on GitHub Codespaces

Python versions:

  • Python 3.11.2 (main, Mar 13 2023, 12:18:29) [GCC 12.2.0] on linux
    
  • Python 3.10.13 (main, Nov 29 2023, 05:20:19) [GCC 12.2.0] on linux
    

autogen: version 0.2.2, branch: gemini, commit: https://github.com/microsoft/autogen/commit/c6792a8adcd5304ad209bb575cd54237c500809c

When:

I tried to run the second function from the samples on the gemini branch with my Google Generative AI API key.

Then:

I got a ValidationError from the pydantic_core package:

pydantic_core._pydantic_core.ValidationError: 1 validation error for Choice
logprobs
  Field required [type=missing, input_value={'finish_reason': 'stop',...=None, tool_calls=None)}, input_type=dict]
    For further information visit https://errors.pydantic.dev/2.5/v/missing

See full traceback below
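The error suggests that `logprobs` became a required (but nullable) field on the OpenAI client's `Choice` model, while `autogen/oai/gemini.py` constructs `Choice` without it. The failure mode can be reproduced in isolation with a minimal stand-in model (hypothetical; the real `Choice` lives in the `openai` package):

```python
from pydantic import BaseModel, ValidationError

# Hypothetical minimal stand-in for the openai client's Choice model:
# `logprobs` is declared as a required field even though its value may be None.
class Choice(BaseModel):
    finish_reason: str
    index: int
    logprobs: None  # required, though its only valid value here is None

try:
    Choice(finish_reason="stop", index=0)  # `logprobs` omitted, as in gemini.py
except ValidationError as e:
    err = e.errors()[0]
    print(err["type"], err["loc"])  # → missing ('logprobs',)

# Supplying logprobs=None explicitly validates fine:
ok = Choice(finish_reason="stop", index=0, logprobs=None)
print(ok.logprobs)  # → None
```

If this diagnosis is right, passing `logprobs=None` (or pinning an older `openai` release) would sidestep the error, but that is an assumption, not a confirmed fix.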

Steps to reproduce

Step 1

First, I tried to install the autogen package from the gemini branch with the following commands (Python 3.10 by default):

pip install https://github.com/microsoft/autogen/archive/gemini.zip
pip install "google-generativeai" "pydash" "pillow"
pip install "pyautogen[gemini]~=0.2.0b4"

Step 2

I got the exception from the second case scenario.

Step 3

Then I created an isolated environment with Poetry and Python 3.11, and installed autogen with the following commands:

pip install https://github.com/microsoft/autogen/archive/gemini.zip
pip install "google-generativeai" "pydash" "pillow"
pip install "pyautogen[gemini]~=0.2.0b4"

Step 4

Again, I got the exception from the second case scenario in the pydantic_core package.

Step 5

Then I thought I might drop the hard-coded version pin (0.2.0b4) on the last installed package, and tried the following commands on my local machine, inside the isolated Poetry environment:

pip install https://github.com/microsoft/autogen/archive/gemini.zip
pip install "google-generativeai" "pydash" "pillow"
pip install "pyautogen[gemini]"

Step 6

I ran only the first function within the main block, with the call to the second function commented out.

Step 7

I got the StopCandidateException (first case scenario). However, it is an intermittent bug, and I managed to trigger it only once.
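Since the StopCandidateException is intermittent, a retry wrapper around the chat call can be a pragmatic workaround — a sketch under the assumption that a RECITATION stop is safe to retry; the exception class below is a local stand-in for the real one from google.generativeai, so the sketch runs without that package:

```python
import time

# Stand-in for google.generativeai.types.generation_types.StopCandidateException
# (hypothetical local class, so this sketch is self-contained).
class StopCandidateException(Exception):
    pass

def call_with_retry(fn, retries=3, delay=0.0):
    """Call fn(), retrying when the model stops early; re-raise after `retries` attempts."""
    for attempt in range(retries):
        try:
            return fn()
        except StopCandidateException:
            if attempt == retries - 1:
                raise
            time.sleep(delay)

# Example with a flaky callable that fails once, then succeeds:
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] == 1:
        raise StopCandidateException("finish_reason: RECITATION")
    return "ok"

print(call_with_retry(flaky))  # → ok
```

In practice `fn` would be a closure around `user_proxy.initiate_chat(...)`; whether retrying is acceptable depends on whether partial conversation state matters.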

Expected Behavior

Agents should start communicating.

Screenshots and logs

Full traceback for the first case scenario:

> $ python sample.py

user_proxy (to assistant):

Write a program in python that Sort the array with Bubble Sort: [4, 1, 3, 2]

--------------------------------------------------------------------------------
Traceback (most recent call last):
  File "\home\PycharmProjects\autogen_gemini_test\sample.py", line 45, in <module>
    first()
  File "\home\PycharmProjects\autogen_gemini_test\sample.py", line 25, in first
    user_proxy.initiate_chat(assistant, message="Write a program in python that Sort the array with Bubble Sort: [4, 1, 3, 2]")
  File "\home\AppData\Local\pypoetry\Cache\virtualenvs\autogen-gemini-test-yrRBJdLh-py3.11\Lib\site-packages\autogen\agentchat\conversable_agent.py", line 562, in initiate_chat
    self.send(self.generate_init_message(**context), recipient, silent=silent)
  File "\home\AppData\Local\pypoetry\Cache\virtualenvs\autogen-gemini-test-yrRBJdLh-py3.11\Lib\site-packages\autogen\agentchat\conversable_agent.py", line 360, in send
    recipient.receive(message, self, request_reply, silent)
  File "\home\AppData\Local\pypoetry\Cache\virtualenvs\autogen-gemini-test-yrRBJdLh-py3.11\Lib\site-packages\autogen\agentchat\conversable_agent.py", line 493, in receive
    reply = self.generate_reply(messages=self.chat_messages[sender], sender=sender)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "\home\AppData\Local\pypoetry\Cache\virtualenvs\autogen-gemini-test-yrRBJdLh-py3.11\Lib\site-packages\autogen\agentchat\conversable_agent.py", line 968, in generate_reply
    final, reply = reply_func(self, messages=messages, sender=sender, config=reply_func_tuple["config"])
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "\home\AppData\Local\pypoetry\Cache\virtualenvs\autogen-gemini-test-yrRBJdLh-py3.11\Lib\site-packages\autogen\agentchat\conversable_agent.py", line 637, in generate_oai_reply
    response = client.create(
               ^^^^^^^^^^^^^^
  File "\home\AppData\Local\pypoetry\Cache\virtualenvs\autogen-gemini-test-yrRBJdLh-py3.11\Lib\site-packages\autogen\oai\client.py", line 274, in create
    response = client.call(params)
               ^^^^^^^^^^^^^^^^^^^
  File "\home\AppData\Local\pypoetry\Cache\virtualenvs\autogen-gemini-test-yrRBJdLh-py3.11\Lib\site-packages\autogen\oai\gemini.py", line 93, in call
    response = chat.send_message(gemini_messages[-1].parts[0].text, stream=stream)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "\home\AppData\Local\pypoetry\Cache\virtualenvs\autogen-gemini-test-yrRBJdLh-py3.11\Lib\site-packages\google\generativeai\generative_models.py", line 384, in send_message
    raise generation_types.StopCandidateException(response.candidates[0])
google.generativeai.types.generation_types.StopCandidateException: index: 0
finish_reason: RECITATION

Full traceback for the second case scenario:

> $ python simple_chat.py 

user_proxy (to assistant):

Write a program in python that Sort the array with Bubble Sort: [4, 1, 3, 2]

--------------------------------------------------------------------------------
Traceback (most recent call last):
  File "/workspaces/autogen/samples/simple_chat.py", line 46, in <module>
    another()
  File "/workspaces/autogen/samples/simple_chat.py", line 25, in another
    user_proxy.initiate_chat(assistant, message="Write a program in python that Sort the array with Bubble Sort: [4, 1, 3, 2]")
  File "/home/vscode/.cache/pypoetry/virtualenvs/samples-czj8q62m-py3.11/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 562, in initiate_chat
    self.send(self.generate_init_message(**context), recipient, silent=silent)
  File "/home/vscode/.cache/pypoetry/virtualenvs/samples-czj8q62m-py3.11/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 360, in send
    recipient.receive(message, self, request_reply, silent)
  File "/home/vscode/.cache/pypoetry/virtualenvs/samples-czj8q62m-py3.11/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 493, in receive
    reply = self.generate_reply(messages=self.chat_messages[sender], sender=sender)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/vscode/.cache/pypoetry/virtualenvs/samples-czj8q62m-py3.11/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 968, in generate_reply
    final, reply = reply_func(self, messages=messages, sender=sender, config=reply_func_tuple["config"])
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/vscode/.cache/pypoetry/virtualenvs/samples-czj8q62m-py3.11/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 637, in generate_oai_reply
    response = client.create(
               ^^^^^^^^^^^^^^
  File "/home/vscode/.cache/pypoetry/virtualenvs/samples-czj8q62m-py3.11/lib/python3.11/site-packages/autogen/oai/client.py", line 274, in create
    response = client.call(params)
               ^^^^^^^^^^^^^^^^^^^
  File "/home/vscode/.cache/pypoetry/virtualenvs/samples-czj8q62m-py3.11/lib/python3.11/site-packages/autogen/oai/gemini.py", line 123, in call
    choices = [Choice(finish_reason="stop", index=0, message=message)]
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/vscode/.cache/pypoetry/virtualenvs/samples-czj8q62m-py3.11/lib/python3.11/site-packages/pydantic/main.py", line 164, in __init__
    __pydantic_self__.__pydantic_validator__.validate_python(data, self_instance=__pydantic_self__)
pydantic_core._pydantic_core.ValidationError: 1 validation error for Choice
logprobs
  Field required [type=missing, input_value={'finish_reason': 'stop',...=None, tool_calls=None)}, input_type=dict]
    For further information visit https://errors.pydantic.dev/2.5/v/missing

Additional Information

Code samples I tried to run

from autogen import UserProxyAgent, ConversableAgent, config_list_from_json, AssistantAgent
from autogen.code_utils import content_str  # needed by the is_termination_msg lambda below

config_list_gemini = [{
        "model": "gemini-pro",
        "api_key": "AIza-my-api-key",
        "api_type": "google"
}]

config_list_gemini_vision = [{
        "model": "gemini-pro-vision",
        "api_key": "AIza-my-api-key",
        "api_type": "google"
}]

def first():
    assistant = AssistantAgent(
        "assistant",
        llm_config={"config_list": config_list_gemini, "seed": 42},
        max_consecutive_auto_reply=13,
    )

    user_proxy = UserProxyAgent(
        "user_proxy",
        code_execution_config={"work_dir": "coding", "use_docker": False},
        human_input_mode="NEVER",
        is_termination_msg=lambda x: content_str(x.get("content")).find("TERMINATE") >= 0,
    )

    user_proxy.initiate_chat(assistant, message="Write a program in python that Sort the array with Bubble Sort: [4, 1, 3, 2]")


def second():

    assistant = ConversableAgent("agent", llm_config={"config_list": config_list_gemini})

    user_proxy = UserProxyAgent("user", code_execution_config=True)

    assistant.initiate_chat(user_proxy, message="How can I help you today?")


if __name__ == "__main__":
    first()
    second()
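Unrelated to the bug itself, but worth noting: the API key can be read from an environment variable instead of being hardcoded in the config list — a minimal sketch assuming a hypothetical `GOOGLE_API_KEY` variable (the `setdefault` line only supplies a placeholder for the demo):

```python
import os

# Hypothetical environment variable name; keeps the real "AIza..." key out of the source.
os.environ.setdefault("GOOGLE_API_KEY", "AIza-my-api-key")  # placeholder for the demo

config_list_gemini = [{
    "model": "gemini-pro",
    "api_key": os.environ["GOOGLE_API_KEY"],
    "api_type": "google",
}]

print(config_list_gemini[0]["api_type"])  # → google
```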

Packages installed for the first case:

$ pip freeze

  • annotated-types==0.6.0
  • anyio==4.2.0
  • cachetools==5.3.2
  • certifi==2023.11.17
  • charset-normalizer==3.3.2
  • colorama==0.4.6
  • diskcache==5.6.3
  • distro==1.9.0
  • FLAML==2.1.1
  • google-ai-generativelanguage==0.4.0
  • google-api-core==2.15.0
  • google-auth==2.26.1
  • google-generativeai==0.3.2
  • googleapis-common-protos==1.62.0
  • grpcio==1.60.0
  • grpcio-status==1.60.0
  • h11==0.14.0
  • httpcore==1.0.2
  • httpx==0.26.0
  • idna==3.6
  • numpy==1.26.3
  • openai==1.6.1
  • pillow==10.2.0
  • proto-plus==1.23.0
  • protobuf==4.25.1
  • pyasn1==0.5.1
  • pyasn1-modules==0.3.0
  • pyautogen @ https://github.com/microsoft/autogen/archive/gemini.zip#sha256=a7ecbc81aa9279dde95be5ef7e33a8cd1733d90db3622c0f29ca8a53b4de511c
  • pydantic==2.5.3
  • pydantic_core==2.14.6
  • pydash==7.0.6
  • python-dotenv==1.0.0
  • regex==2023.12.25
  • requests==2.31.0
  • rsa==4.9
  • sniffio==1.3.0
  • termcolor==2.4.0
  • tiktoken==0.5.2
  • tqdm==4.66.1
  • typing_extensions==4.9.0
  • urllib3==2.1.0

Packages installed for the second case:

$ pip freeze

  • annotated-types==0.6.0
  • anyio==4.2.0
  • cachetools==5.3.2
  • certifi==2023.11.17
  • charset-normalizer==3.3.2
  • diskcache==5.6.3
  • distro==1.9.0
  • FLAML==2.1.1
  • google-ai-generativelanguage==0.4.0
  • google-api-core==2.15.0
  • google-auth==2.26.1
  • google-generativeai==0.3.2
  • googleapis-common-protos==1.62.0
  • grpcio==1.60.0
  • grpcio-status==1.60.0
  • h11==0.14.0
  • httpcore==1.0.2
  • httpx==0.26.0
  • idna==3.6
  • numpy==1.26.3
  • openai==1.6.1
  • pillow==10.2.0
  • proto-plus==1.23.0
  • protobuf==4.25.1
  • pyasn1==0.5.1
  • pyasn1-modules==0.3.0
  • pyautogen @ https://github.com/microsoft/autogen/archive/gemini.zip#sha256=a7ecbc81aa9279dde95be5ef7e33a8cd1733d90db3622c0f29ca8a53b4de511c
  • pydantic==2.5.3
  • pydantic_core==2.14.6
  • pydash==7.0.6
  • python-dotenv==1.0.0
  • regex==2023.12.25
  • requests==2.31.0
  • rsa==4.9
  • sniffio==1.3.0
  • termcolor==2.4.0
  • tiktoken==0.5.2
  • tqdm==4.66.1
  • typing_extensions==4.9.0
  • urllib3==2.1.0

OTR · Jan 04 '24 16:01