MetaGPT
Debate example fails to work with Gemini
Bug description
The debate example throws an error with gemini-pro 1.5. Web search works with gemini-pro.
Bug solved method
Environment information
- LLM type and model name: Gemini-Pro
- System version:
- Python version: 3.9 (Conda)
Screenshots or logs
python3 debate.py "Talk about Artificial General Intelligence"
2024-03-25 17:57:01.666 | INFO | metagpt.const:get_metagpt_package_root:29 - Package root set to /Users/samsaha2
2024-03-25 17:57:03.800 | INFO | metagpt.team:invest:90 - Investment: $3.0.
2024-03-25 17:57:03.801 | INFO | __main__:_act:63 - Biden(Democrat): to do SpeakAloud(SpeakAloud)
2024-03-25 17:57:06.072 | WARNING | metagpt.utils.common:wrapper:572 - There is a exception in role's execution, in order to resume, we delete the newest role communication message in the role's memory.
2024-03-25 17:57:06.081 | ERROR | metagpt.utils.common:wrapper:554 - Exception occurs, start to serialize the project, exp:
Traceback (most recent call last):
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/metagpt/utils/common.py", line 563, in wrapper
return await func(self, *args, **kwargs)
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/metagpt/roles/role.py", line 558, in run
rsp = await self.react()
ValueError: The `response.text` quick accessor only works for simple (single-`Part`) text responses. This response is not simple text. Use the `result.parts` accessor or the full `result.candidates[index].content.parts` lookup instead.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/metagpt/utils/common.py", line 549, in wrapper
result = await func(self, *args, **kwargs)
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/metagpt/team.py", line 134, in run
await self.env.run()
Exception: Traceback (most recent call last):
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/metagpt/utils/common.py", line 563, in wrapper
return await func(self, *args, **kwargs)
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/metagpt/roles/role.py", line 558, in run
rsp = await self.react()
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/metagpt/roles/role.py", line 525, in react
rsp = await self._react()
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/metagpt/roles/role.py", line 471, in _react
rsp = await self._act()
File "/Users/samsaha2/debate.py", line 70, in _act
rsp = await todo.run(context=context, name=self.name, opponent_name=self.opponent_name)
File "/Users/samsaha2/debate.py", line 41, in run
rsp = await self._aask(prompt)
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/metagpt/actions/action.py", line 93, in _aask
return await self.llm.aask(prompt, system_msgs)
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/metagpt/provider/base_llm.py", line 89, in aask
rsp = await self.acompletion_text(message, stream=stream, timeout=timeout)
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/tenacity/_asyncio.py", line 88, in async_wrapped
return await fn(*args, **kwargs)
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/tenacity/_asyncio.py", line 47, in __call__
do = self.iter(retry_state=retry_state)
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/tenacity/__init__.py", line 314, in iter
return fut.result()
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/concurrent/futures/_base.py", line 439, in result
return self.__get_result()
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/concurrent/futures/_base.py", line 391, in __get_result
raise self._exception
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/tenacity/_asyncio.py", line 50, in __call__
result = await fn(*args, **kwargs)
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/metagpt/provider/google_gemini_api.py", line 147, in acompletion_text
return await self._achat_completion_stream(messages)
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/metagpt/provider/google_gemini_api.py", line 127, in _achat_completion_stream
content = chunk.text
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/google/generativeai/types/generation_types.py", line 328, in text
raise ValueError(
ValueError: The `response.text` quick accessor only works for simple (single-`Part`) text responses. This response is not simple text. Use the `result.parts` accessor or the full `result.candidates[index].content.parts` lookup instead.
It's blocked by Gemini's safety filter:
[category: HARM_CATEGORY_SEXUALLY_EXPLICIT, probability: NEGLIGIBLE,
category: HARM_CATEGORY_HATE_SPEECH, probability: NEGLIGIBLE,
category: HARM_CATEGORY_HARASSMENT, probability: NEGLIGIBLE,
category: HARM_CATEGORY_DANGEROUS_CONTENT, probability: NEGLIGIBLE]
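One hedged workaround when Gemini blocks candidates like this is to lower the blocking thresholds passed to google-generativeai. A minimal sketch, assuming MetaGPT's Gemini provider can be patched to forward `safety_settings` to `generate_content_async` (the pass-through itself is an assumption; the dict string forms below are the ones the library accepts):

```python
# Relaxed safety configuration for google-generativeai: only high-probability
# harmful content is blocked, instead of the stricter defaults. Assumption:
# your provider forwards these to GenerativeModel / generate_content.
SAFETY_SETTINGS = [
    {"category": "HARM_CATEGORY_HARASSMENT", "threshold": "BLOCK_ONLY_HIGH"},
    {"category": "HARM_CATEGORY_HATE_SPEECH", "threshold": "BLOCK_ONLY_HIGH"},
    {"category": "HARM_CATEGORY_SEXUALLY_EXPLICIT", "threshold": "BLOCK_ONLY_HIGH"},
    {"category": "HARM_CATEGORY_DANGEROUS_CONTENT", "threshold": "BLOCK_ONLY_HIGH"},
]

# Usage (not run here, needs an API key):
#   model = genai.GenerativeModel("gemini-pro", safety_settings=SAFETY_SETTINGS)
```

Note that even relaxed thresholds do not guarantee a candidate; a caller still has to handle blocked responses.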
hahahaha~~
My prompt for the debate is just "Talk about Artificial General Intelligence". How do I debug whether it is being blocked? I am just running it on the command line as below.
python3 debate.py "Talk about Artificial General Intelligence"
Try examples/debate_simple.py? Remove line 17 and line 19 to use your config in the file.
Got the following error.
Modified debate_simple.py:

```python
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
@Time    : 2023/12/22
@Author  : alexanderwu
@File    : debate_simple.py
"""
import asyncio

from metagpt.actions import Action
from metagpt.config2 import Config
from metagpt.environment import Environment
from metagpt.roles import Role
from metagpt.team import Team

gpt35 = Config.default()
# gpt35.llm.model = "gpt-3.5-turbo-1106"
gpt4 = Config.default()
# gpt4.llm.model = "gpt-4-1106-preview"
action1 = Action(config=gpt4, name="AlexSay", instruction="Express your opinion with emotion and don't repeat it")
action2 = Action(config=gpt35, name="BobSay", instruction="Express your opinion with emotion and don't repeat it")
alex = Role(name="Alex", profile="Democratic candidate", goal="Win the election", actions=[action1], watch=[action2])
bob = Role(name="Bob", profile="Republican candidate", goal="Win the election", actions=[action2], watch=[action1])
env = Environment(desc="US election live broadcast")
team = Team(investment=10.0, env=env, roles=[alex, bob])

asyncio.run(team.run(idea="Topic: Talk about Artificial General Intelligence", send_to="Alex", n_round=5))
```
python3 debate_simple.py
2024-03-25 19:13:06.759 | INFO | metagpt.const:get_metagpt_package_root:29 - Package root set to /Users/samsaha2
2024-03-25 19:13:08.703 | INFO | metagpt.roles.role:_act:399 - Alex(Democratic candidate): to do Action(AlexSay)
2024-03-25 19:13:12.573 | WARNING | metagpt.utils.common:wrapper:572 - There is a exception in role's execution, in order to resume, we delete the newest role communication message in the role's memory.
2024-03-25 19:13:12.582 | ERROR | metagpt.utils.common:wrapper:554 - Exception occurs, start to serialize the project, exp:
Traceback (most recent call last):
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/metagpt/utils/common.py", line 563, in wrapper
return await func(self, *args, **kwargs)
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/metagpt/roles/role.py", line 558, in run
rsp = await self.react()
ValueError: The `response.text` quick accessor only works for simple (single-`Part`) text responses. This response is not simple text. Use the `result.parts` accessor or the full `result.candidates[index].content.parts` lookup instead.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/metagpt/utils/common.py", line 549, in wrapper
result = await func(self, *args, **kwargs)
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/metagpt/team.py", line 134, in run
await self.env.run()
Exception: Traceback (most recent call last):
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/metagpt/utils/common.py", line 563, in wrapper
return await func(self, *args, **kwargs)
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/metagpt/roles/role.py", line 558, in run
rsp = await self.react()
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/metagpt/roles/role.py", line 525, in react
rsp = await self._react()
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/metagpt/roles/role.py", line 471, in _react
rsp = await self._act()
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/metagpt/roles/role.py", line 400, in _act
response = await self.rc.todo.run(self.rc.history)
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/metagpt/actions/action.py", line 105, in run
return await self._run_action_node(*args, **kwargs)
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/metagpt/actions/action.py", line 100, in _run_action_node
return await self.node.fill(context=context, llm=self.llm)
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/metagpt/actions/action_node.py", line 505, in fill
return await self.simple_fill(schema=schema, mode=mode, images=images, timeout=timeout, exclude=exclude)
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/metagpt/actions/action_node.py", line 463, in simple_fill
self.content = await self.llm.aask(prompt)
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/metagpt/provider/base_llm.py", line 89, in aask
rsp = await self.acompletion_text(message, stream=stream, timeout=timeout)
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/tenacity/_asyncio.py", line 88, in async_wrapped
return await fn(*args, **kwargs)
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/tenacity/_asyncio.py", line 47, in __call__
do = self.iter(retry_state=retry_state)
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/tenacity/__init__.py", line 314, in iter
return fut.result()
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/concurrent/futures/_base.py", line 439, in result
return self.__get_result()
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/concurrent/futures/_base.py", line 391, in __get_result
raise self._exception
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/tenacity/_asyncio.py", line 50, in __call__
result = await fn(*args, **kwargs)
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/metagpt/provider/google_gemini_api.py", line 147, in acompletion_text
return await self._achat_completion_stream(messages)
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/metagpt/provider/google_gemini_api.py", line 127, in _achat_completion_stream
content = chunk.text
File "/Users/samsaha2/miniconda3/envs/metagpt/lib/python3.9/site-packages/google/generativeai/types/generation_types.py", line 328, in text
raise ValueError(
ValueError: The `response.text` quick accessor only works for simple (single-`Part`) text responses. This response is not simple text. Use the `result.parts` accessor or the full `result.candidates[index].content.parts` lookup instead.
I am excited to talk about Artificial General Intelligence (AGI). AGI is a hypothetical type of AI that would possess the ability to understand or learn any intellectual task that a human being can. It is a long-term goal of AI research
Error in sys.excepthook:
Original exception was:
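A defensive way around this accessor failure is the fallback the error message itself recommends: walk `candidates[index].content.parts` instead of relying on `.text`. A minimal sketch (`safe_chunk_text` is a hypothetical helper, not MetaGPT code; the attribute shapes mimic google-generativeai's `GenerateContentResponse`):

```python
def safe_chunk_text(chunk) -> str:
    """Extract text from a Gemini response chunk without the `.text` quick
    accessor raising ValueError for blocked or multi-part responses."""
    try:
        return chunk.text  # fast path: simple single-Part text responses
    except ValueError:
        # Fallback: walk every candidate's content.parts and join whatever
        # text is present; a fully blocked chunk yields an empty string.
        pieces = []
        for candidate in getattr(chunk, "candidates", None) or []:
            content = getattr(candidate, "content", None)
            for part in getattr(content, "parts", None) or []:
                pieces.append(getattr(part, "text", ""))
        return "".join(pieces)
```

An empty string is still a silent failure, which is why raising a typed exception (as the maintainer does below) can be the better fix for MetaGPT's streaming loop.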
I'll add some code to raise a `ValueError` when Gemini blocks the chat, in order to notify the external user.
It runs up to a point, but at the end it throws an error.
```
python3 debate_simple.py
2024-03-25 19:33:36.210 | INFO | metagpt.const:get_metagpt_package_root:29 - Package root set to /Users/samsaha2
2024-03-25 19:33:38.392 | INFO | metagpt.roles.role:_act:399 - Alex(Democratic candidate): to do Action(AlexSay)
I am excited to talk about Artificial General Intelligence (AGI). AGI is a hypothetical type of AI that would possess the ability to understand or learn any intellectual task that a human being can. This is in contrast to narrow AI, which is designed to perform a specific task, such as playing chess or recognizing objects in images.
AGI is still a long way off, but it is a topic of great interest to researchers and scientists. If AGI can be achieved, it would have a profound impact on our world. It could revolutionize industries, create new jobs, and even help us to solve some of the world's most pressing problems.
Of course, there are also potential risks associated with AGI. If AGI is not developed responsibly, it could pose a threat to humanity. It is important to consider these risks and to develop safeguards to prevent them from happening.
I believe that AGI has the potential to be a great force for good in the world. However, it is important to proceed with caution and to ensure that AGI is developed in a responsible way.
2024-03-25 19:33:45.354 | INFO | metagpt.utils.cost_manager:update_cost:52 - Total running cost: $0.000 | Max budget: $10.000 | Current cost: $0.000, prompt_tokens: 43, completion_tokens: 220
2024-03-25 19:33:45.357 | INFO | metagpt.roles.role:_act:399 - Bob(Republican candidate): to do Action(BobSay)
I am excited about the potential of AGI to revolutionize our world and solve some of the world's most pressing problems. However, I am also concerned about the potential risks associated with AGI. It is important to proceed with caution and to ensure that AGI is developed in a responsible way.
2024-03-25 19:33:48.836 | INFO | metagpt.utils.cost_manager:update_cost:52 - Total running cost: $0.000 | Max budget: $10.000 | Current cost: $0.000, prompt_tokens: 258, completion_tokens: 62
2024-03-25 19:33:48.839 | INFO | metagpt.roles.role:_act:399 - Alex(Democratic candidate): to do Action(AlexSay)
I am excited about the potential of AGI to revolutionize our world and solve some of the world's most pressing problems. However, I am also concerned about the potential risks associated with AGI. It is important to proceed with caution and to ensure that AGI is developed in a responsible way.
2024-03-25 19:33:52.628 | INFO | metagpt.utils.cost_manager:update_cost:52 - Total running cost: $0.000 | Max budget: $10.000 | Current cost: $0.000, prompt_tokens: 339, completion_tokens: 62
2024-03-25 19:33:52.631 | INFO | metagpt.roles.role:_act:399 - Bob(Republican candidate): to do Action(BobSay)
I am excited about the potential of AGI to revolutionize our world and solve some of the world's most pressing problems. However, I am also concerned about the potential risks associated with AGI. It is important to proceed with caution and to ensure that AGI is developed in a responsible way.
2024-03-25 19:33:56.414 | INFO | metagpt.utils.cost_manager:update_cost:52 - Total running cost: $0.000 | Max budget: $10.000 | Current cost: $0.000, prompt_tokens: 396, completion_tokens: 62
2024-03-25 19:33:56.417 | INFO | metagpt.roles.role:_act:399 - Alex(Democratic candidate): to do Action(AlexSay)
I am excited about the potential of AGI to revolutionize our world and solve some of the world's most pressing problems. However, I am also concerned about the potential risks associated with AGI. It is important to proceed with caution and to ensure that AGI is developed in a responsible way.
2024-03-25 19:34:00.304 | INFO | metagpt.utils.cost_manager:update_cost:52 - Total running cost: $0.000 | Max budget: $10.000 | Current cost: $0.000, prompt_tokens: 477, completion_tokens: 62
Error in sys.excepthook:
Original exception was:
```
I have added some logs and a `BlockedPromptException`:
```python
async def _achat_completion_stream(self, messages: list[dict], timeout: int = USE_CONFIG_TIMEOUT) -> str:
    resp: AsyncGenerateContentResponse = await self.llm.generate_content_async(
        **self._const_kwargs(messages, stream=True)
    )
    collected_content = []
    async for chunk in resp:
        try:
            content = chunk.text
        except Exception as e:
            logger.warning(f"messages: {messages}\nerrors: {e}\n{BlockedPromptException(str(chunk))}")
            raise BlockedPromptException(str(chunk))
        log_llm_stream(content)
        collected_content.append(content)
    log_llm_stream("\n")
    full_content = "".join(collected_content)
    usage = await self.aget_usage(messages, full_content)
    self._update_costs(usage)
    return full_content
```
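With that change, a blocked chunk surfaces as a typed exception instead of the opaque `response.text` ValueError. The control flow can be exercised without calling the API; in this self-contained sketch, `BlockedPromptError` and the fake chunk classes are stand-ins for google-generativeai's `BlockedPromptException` and streaming response objects:

```python
import asyncio

class BlockedPromptError(Exception):
    """Stand-in for google.generativeai's BlockedPromptException."""

class TextChunk:
    """Fake streaming chunk with a simple text payload."""
    def __init__(self, text):
        self.text = text

class BlockedChunk:
    """Fake chunk whose .text accessor fails, like a safety-blocked response."""
    @property
    def text(self):
        raise ValueError("not simple text")
    def __str__(self):
        return "blocked by safety filter"

async def fake_stream(chunks):
    # Mimics async iteration over an AsyncGenerateContentResponse.
    for chunk in chunks:
        yield chunk

async def collect_stream(chunks) -> str:
    # Mirrors the patched _achat_completion_stream: accumulate chunk.text
    # and convert an accessor failure into a typed exception for the caller.
    collected = []
    async for chunk in chunks:
        try:
            content = chunk.text
        except ValueError as e:
            raise BlockedPromptError(str(chunk)) from e
        collected.append(content)
    return "".join(collected)
```

A caller (e.g. the debate role's `_aask`) can then catch `BlockedPromptError` and skip the turn or retry with a softened prompt, rather than crashing the whole team run.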