openai.RateLimitError (429) occurred while running the official example code
I tried the official example: website
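For context, my test.py follows the multi-role pattern from the docs. Below is a minimal sketch of that kind of script (role names A/B/C and the $10 budget match the log below; the Say action's prompt text, and the omitted Vote action, are my own placeholders rather than the original example's):

```python
import asyncio

from metagpt.actions import Action
from metagpt.roles import Role
from metagpt.team import Team


class Say(Action):
    """Ask the LLM for the next statement in the debate (placeholder prompt)."""
    PROMPT_TEMPLATE: str = "Debate so far:\n{context}\nGive your next statement."

    async def run(self, context: str):
        return await self._aask(self.PROMPT_TEMPLATE.format(context=context))


class Debator(Role):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.set_actions([Say])  # each role replies with a Say action


async def main():
    team = Team()
    team.hire(
        [
            Debator(name="A", profile="Democratic candidate"),
            Debator(name="B", profile="Republican candidate"),
            Debator(name="C", profile="Voter"),
        ]
    )
    team.invest(10.0)  # matches "Max budget: $10.000" in the log
    team.run_project("Debate topic: climate change. Take turns stating your position.")
    await team.run(n_round=3)


if __name__ == "__main__":
    asyncio.run(main())
```

Running this produces the output below until the rate-limit error appears.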
(metagpt) PS F:\MetaGPT> python F:\MetaGPT\test.py
2024-06-06 18:34:00.043 | INFO | metagpt.const:get_metagpt_package_root:29 - Package root set to F:\MetaGPT
2024-06-06 18:34:02.219 | INFO | metagpt.roles.role:_act:402 - A(Democratic candidate): to do Action(Say)
A: Climate change is a dire threat to our planet, and it's time for bold action. We must invest in renewable energy, reduce emissions, and protect our environment for future generations. It's not just about policy; it's about our moral responsibility to act.
B: I agree with A, but we also need to consider the economic impact. We can't just shut down industries overnight. We need a balanced approach that protects jobs and the environment.
C: Climate change is real, but it's not as urgent as some make it out to be. We should focus on economic growth and let the market drive innovation. If we grow our economy, we can afford to invest in clean energy later.
A: We can't afford to wait, C. The science is clear, and the time for action is now. We need to prioritize our environment and invest in a green future. It's not just about growth; it's about sustainability.
2024-06-06 18:34:07.873 | INFO | metagpt.utils.cost_manager:update_cost:57 - Total running cost: $0.003 | Max budget: $10.000 | Current cost: $0.003, prompt_tokens: 88, completion_tokens: 191
2024-06-06 18:34:07.875 | INFO | metagpt.roles.role:_act:402 - B(Republican candidate): to do Action(Say)
2024-06-06 18:34:07.879 | INFO | metagpt.roles.role:_act:402 - C(Voter): to do Action(Vote)
B: I hear the concerns, and I understand the urgency, but let's not forget that our economy is the lifeblood of our nation. We must find a way to transition to a greener future without leaving our workers behind. It's about creating new opportunities, not just replacing old ones. We need to invest in innovation that can make renewable energy cost-effective and accessible to all, ensuring that our environment is protected while our economy thrives. This is not just about survival; it's about prosperity for all Americans.
2024-06-06 18:34:10.638 | INFO | metagpt.utils.cost_manager:update_cost:57 - Total running cost: $0.005 | Max budget: $10.000 | Current cost: $0.005, prompt_tokens: 271, completion_tokens: 105
2024-06-06 18:34:11.010 | WARNING | metagpt.utils.common:wrapper:649 - There is a exception in role's execution, in order to resume, we delete the newest role communication message in the role's memory.
2024-06-06 18:34:11.017 | ERROR | metagpt.utils.common:wrapper:631 - Exception occurs, start to serialize the project, exp:
Traceback (most recent call last):
File "F:\MetaGPT\metagpt\utils\common.py", line 640, in wrapper
return await func(self, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\MetaGPT\metagpt\roles\role.py", line 550, in run
rsp = await self.react()
^^^^^^^^^^^^^^^^^^
openai.RateLimitError: Error code: 429 - {'error': {'message': 'Your account cpfvkr9p2k122r0f6b50
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "F:\MetaGPT\metagpt\utils\common.py", line 626, in wrapper
result = await func(self, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\MetaGPT\metagpt\team.py", line 131, in run
await self.env.run()
Exception: Traceback (most recent call last):
File "F:\MetaGPT\metagpt\utils\common.py", line 640, in wrapper
return await func(self, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\MetaGPT\metagpt\roles\role.py", line 550, in run
rsp = await self.react()
^^^^^^^^^^^^^^^^^^
File "F:\MetaGPT\metagpt\roles\role.py", line 519, in react
rsp = await self._react()
^^^^^^^^^^^^^^^^^^^
File "F:\MetaGPT\metagpt\roles\role.py", line 474, in _react
rsp = await self._act()
^^^^^^^^^^^^^^^^^
File "F:\MetaGPT\metagpt\roles\role.py", line 403, in _act
response = await self.rc.todo.run(self.rc.history)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\MetaGPT\metagpt\actions\action.py", line 105, in run
return await self._run_action_node(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\MetaGPT\metagpt\actions\action.py", line 100, in run_action_node
return await self.node.fill(context=context, llm=self.llm)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\MetaGPT\metagpt\actions\action_node.py", line 504, in fill
return await self.simple_fill(schema=schema, mode=mode, images=images, timeout=timeout, exclude=exclude)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\MetaGPT\metagpt\actions\action_node.py", line 462, in simple_fill
self.content = await self.llm.aask(prompt)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\MetaGPT\metagpt\provider\base_llm.py", line 152, in aask
rsp = await self.acompletion_text(message, stream=stream, timeout=self.get_timeout(timeout))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\anacoda\envs\metagpt\Lib\site-packages\tenacity_asyncio.py", line 88, in async_wrapped
return await fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\anacoda\envs\metagpt\Lib\site-packages\tenacity_asyncio.py", line 47, in call
do = self.iter(retry_state=retry_state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\anacoda\envs\metagpt\Lib\site-packages\tenacity_init.py", line 314, in iter
return fut.result()
^^^^^^^^^^^^
File "F:\anacoda\envs\metagpt\Lib\concurrent\futures_base.py", line 449, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "F:\anacoda\envs\metagpt\Lib\concurrent\futures_base.py", line 401, in __get_result
raise self._exception
File "F:\anacoda\envs\metagpt\Lib\site-packages\tenacity_asyncio.py", line 50, in call
result = await fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\MetaGPT\metagpt\provider\openai_api.py", line 158, in acompletion_text
return await self._achat_completion_stream(messages, timeout=timeout)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\MetaGPT\metagpt\provider\openai_api.py", line 90, in _achat_completion_stream
response: AsyncStream[ChatCompletionChunk] = await self.aclient.chat.completions.create(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\anacoda\envs\metagpt\Lib\site-packages\openai\resources\chat\completions.py", line 1181, in create
return await self._post(
^^^^^^^^^^^^^^^^^
File "F:\anacoda\envs\metagpt\Lib\site-packages\openai_base_client.py", line 1790, in post
return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\anacoda\envs\metagpt\Lib\site-packages\openai_base_client.py", line 1493, in request
return await self._request(
^^^^^^^^^^^^^^^^^^^^
File "F:\anacoda\envs\metagpt\Lib\site-packages\openai_base_client.py", line 1569, in _request
return await self._retry_request(
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\anacoda\envs\metagpt\Lib\site-packages\openai_base_client.py", line 1615, in _retry_request
return await self._request(
^^^^^^^^^^^^^^^^^^^^
File "F:\anacoda\envs\metagpt\Lib\site-packages\openai_base_client.py", line 1569, in _request
return await self._retry_request(
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\anacoda\envs\metagpt\Lib\site-packages\openai_base_client.py", line 1615, in _retry_request
return await self._request(
^^^^^^^^^^^^^^^^^^^^
File "F:\anacoda\envs\metagpt\Lib\site-packages\openai_base_client.py", line 1584, in _request
raise self._make_status_error_from_response(err.response) from None
openai.RateLimitError: Error code: 429 - {'error': {'message': 'Your account cpfvkr9p2k122r0f6b50
Try checking your API account status; the full error shows the provider's rate limit was hit:
openai.RateLimitError: Error code: 429 - {'error': {'message': 'Your account cpfvkr9p2k122r0f6b50 request reached max request: 3, please try again after 1 seconds', 'type': 'rate_limit_reached_error'}}
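If the 429 persists, the account or relay itself is throttling requests rather than anything MetaGPT does. You can confirm this by calling the same endpoint directly with the openai client and backing off on 429s. A minimal sketch (base_url, api_key, and model are placeholders for whatever is in your config2.yaml):

```python
import time

from openai import OpenAI, RateLimitError

# Placeholders: point these at the same provider/relay configured in config2.yaml.
client = OpenAI(api_key="sk-...", base_url="https://your-relay.example/v1")

for attempt in range(5):
    try:
        rsp = client.chat.completions.create(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": "ping"}],
        )
        print(rsp.choices[0].message.content)
        break
    except RateLimitError as exc:
        # 429 from the provider: wait and retry, per the "try again after 1 seconds" hint.
        print(f"rate limited on attempt {attempt + 1}: {exc}")
        time.sleep(2 ** attempt)
```

If the direct call is also rate limited after a short wait, the fix is on the provider side (raise the account's request limit or slow down the run), not in the example script.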