MetaGPT
Hitting a tenacity.RetryError when using llama2 as the chat model
I am using llama2 as my model and deploy it with llama-cpp-python, which offers a web server that aims to act as a drop-in replacement for the OpenAI API. After the code runs for some time, an error occurs. The command is `python startup.py "Write a cli snake game" --code_review True`.
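For reference, the local server is wired up roughly like this (a sketch only: the key names follow MetaGPT's OpenAI-compatible config options, and the URL, dummy key, and exact read mechanism are placeholders/assumptions rather than my literal setup):

```python
import os

# Point MetaGPT's OpenAI-compatible client at the local llama-cpp-python
# server instead of api.openai.com. MetaGPT reads these settings from its
# config file or the environment; treat the environment-variable route here
# as an illustrative assumption.
os.environ["OPENAI_API_BASE"] = "http://localhost:8000/v1"  # llama-cpp-python web server
os.environ["OPENAI_API_KEY"] = "sk-local-placeholder"       # dummy key for the local server
os.environ["OPENAI_API_MODEL"] = "gpt-3.5"                  # the name that later reaches the token counter (see traceback)
```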
I checked the llama logs and they look normal to me.
The error and the full logs are as follows.
2023-07-29 21:08:05.514 | INFO | metagpt.config:__init__:44 - Config loading done.
2023-07-29 21:08:06.185 | INFO | metagpt.software_company:invest:39 - Investment: $3.0.
2023-07-29 21:08:06.185 | INFO | metagpt.roles.role:_act:166 - Alice(Product Manager): ready to WritePRD
Please provide the missing information to complete the product requirements document.
---
## Original Requirements
The boss wants us to create a CLI snake game that can be played in the terminal. The game should have the following features:
## Product Goals
Our goals for this product are:
1. To provide an engaging and entertaining gaming experience for users.
2. To create a game that is easy to learn but challenging to master.
3. To design a game that can be played in the terminal, making it accessible to users who prefer a more traditional gaming experience.
## User Stories
As a user, I want to:
1. Be able to control the snake's movement using arrow keys or WASD keys.
2. See the game board and the snake's position in real-time.
3. Be able to eat food pellets to grow the snake and avoid colliding with walls or its own body.
4. Experience a sense of progression and achievement as I complete levels and unlock new skins for the snake.
5. Have the option to play against other players in a multiplayer mode.
## Competitive Analysis
We have analyzed several similar CLI games and identified the following strengths and weaknesses:
Strengths:
1. Easy to learn and pick up.
2. Provides a sense of progression and achievement.
3. Offers a variety of game modes and skins.
Weaknesses:
1. Limited customization options.
2. Lack of multiplayer features.
3. Outdated graphics and design.
## Competitive Quadrant Chart
Here is a quadrant chart comparing our product to its competitors:
```mermaid
quadrantChart
title Reach and engagement of campaigns
x-axis Low Reach --> High Reach
y-axis Low Engagement --> High Engagement
quadrant-1 We should expand
quadrant-2 Need to promote
quadrant-3 Re-evaluate
quadrant-4 May be improved
"Campaign: A": [0.3, 0.6]
"Campaign B": [0.45, 0.23]
"Campaign C": [0.57, 0.69]
"Campaign D": [0.78, 0.34]
"Campaign E": [0.40, 0.34]
"Campaign F": [0.35, 0.78]
"Our Target Product": [0.5, 0.6]
```
## Requirement Analysis
The product should be a CLI snake game that can be played in the terminal. It should have the following requirements:
- The game should have a simple and intuitive control scheme.
- The game board should be displayed in real-time, with the snake's position and food pellets visible.
- The snake should move smoothly and respond to user input accurately.
- The game should have a variety of levels with increasing difficulty.
- The game should include multiplayer features, allowing users to play against each other.
- The game should have customization options, such as skins and backgrounds.
- The game should be accessible to users with disabilities.
## Requirement Pool
Here are the requirements for our product, prioritized based on importance and feasibility:
- P0: Create a simple and intuitive control scheme for the snake's movement.
- P1: Design a user-friendly game board that displays the snake's position and food pellets in real-time.
- P2: Implement smooth and accurate snake movement based on user input.
- P3: Develop a variety of levels with increasing difficulty to keep users engaged.
- P4: Include multiplayer features, allowing users to play against each other.
- P5: Add customization options, such as skins and backgrounds.
- P6: Ensure the game is accessible to users with disabilities.
## UI Design draft
Here is a simple design for our CLI snake game:
Game Board:
```
_______
/ \
/ \
/___________\
| | | |
| | | |
|
```
Warning: model not found. Using cl100k_base encoding.
You are a Product Manager, named Alice, your goal is Efficiently create a successful product, and the constraint is .
Warning: model not found. Using cl100k_base encoding.
You are a Product Manager, named Alice, your goal is Efficiently create a successful product, and the constraint is .
Warning: model not found. Using cl100k_base encoding.
You are a Product Manager, named Alice, your goal is Efficiently create a successful product, and the constraint is .
Warning: model not found. Using cl100k_base encoding.
You are a Product Manager, named Alice, your goal is Efficiently create a successful product, and the constraint is .
Please provide the missing information in the format specified above.
Warning: model not found. Using cl100k_base encoding.
You are a Product Manager, named Alice, your goal is Efficiently create a successful product, and the constraint is .
Please provide the information in the format example.
Warning: model not found. Using cl100k_base encoding.
You are a Product Manager, named Alice, your goal is Efficiently create a successful product, and the constraint is .
Warning: model not found. Using cl100k_base encoding.
You are a Product Manager, named Alice, your goal is Efficiently create a successful product, and the constraint is .
Warning: model not found. Using cl100k_base encoding.
You are a Product Manager, named Alice, your goal is Efficiently create a successful product, and the constraint is .
Warning: model not found. Using cl100k_base encoding.
You are a Product Manager, named Alice, your goal is Efficiently create a successful product, and the constraint is .
Warning: model not found. Using cl100k_base encoding.
You are a Product Manager, named Alice, your goal is Efficiently create a successful product, and the constraint is .
Warning: model not found. Using cl100k_base encoding.
You are a Product Manager, named Alice, your goal is Efficiently create a successful product, and the constraint is .
Warning: model not found. Using cl100k_base encoding.
Traceback (most recent call last):
File "/home/zjh/.conda/envs/metagpt/lib/python3.9/site-packages/tenacity-8.2.2-py3.9.egg/tenacity/_asyncio.py", line 50, in __call__
result = await fn(*args, **kwargs)
File "/home/zjh/MetaGPT/metagpt/actions/action.py", line 57, in _aask_v1
content = await self.llm.aask(prompt, system_msgs)
File "/home/zjh/MetaGPT/metagpt/provider/base_gpt_api.py", line 44, in aask
rsp = await self.acompletion_text(message, stream=True)
File "/home/zjh/MetaGPT/metagpt/provider/openai_api.py", line 32, in wrapper
return await f(*args, **kwargs)
File "/home/zjh/MetaGPT/metagpt/provider/openai_api.py", line 218, in acompletion_text
return await self._achat_completion_stream(messages)
File "/home/zjh/MetaGPT/metagpt/provider/openai_api.py", line 169, in _achat_completion_stream
usage = self._calc_usage(messages, full_reply_content)
File "/home/zjh/MetaGPT/metagpt/provider/openai_api.py", line 224, in _calc_usage
prompt_tokens = count_message_tokens(messages, self.model)
File "/home/zjh/MetaGPT/metagpt/utils/token_counter.py", line 55, in count_message_tokens
raise NotImplementedError(
NotImplementedError: num_tokens_from_messages() is not implemented for model gpt-3.5. See https://github.com/openai/openai-python/blob/main/chatml.md for information on how messages are converted to tokens.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/zjh/MetaGPT/startup.py", line 36, in <module>
fire.Fire(main)
File "/home/zjh/.conda/envs/metagpt/lib/python3.9/site-packages/fire-0.4.0-py3.9.egg/fire/core.py", line 141, in Fire
component_trace = _Fire(component, args, parsed_flag_args, context, name)
File "/home/zjh/.conda/envs/metagpt/lib/python3.9/site-packages/fire-0.4.0-py3.9.egg/fire/core.py", line 466, in _Fire
component, remaining_args = _CallAndUpdateTrace(
File "/home/zjh/.conda/envs/metagpt/lib/python3.9/site-packages/fire-0.4.0-py3.9.egg/fire/core.py", line 681, in _CallAndUpdateTrace
component = fn(*varargs, **kwargs)
File "/home/zjh/MetaGPT/startup.py", line 32, in main
asyncio.run(startup(idea, investment, n_round, code_review))
File "/home/zjh/.conda/envs/metagpt/lib/python3.9/asyncio/runners.py", line 44, in run
return loop.run_until_complete(main)
File "/home/zjh/.conda/envs/metagpt/lib/python3.9/asyncio/base_events.py", line 642, in run_until_complete
return future.result()
File "/home/zjh/MetaGPT/startup.py", line 20, in startup
await company.run(n_round=n_round)
File "/home/zjh/MetaGPT/metagpt/software_company.py", line 60, in run
await self.environment.run()
File "/home/zjh/MetaGPT/metagpt/environment.py", line 56, in run
await asyncio.gather(*futures)
File "/home/zjh/MetaGPT/metagpt/roles/role.py", line 239, in run
rsp = await self._react()
File "/home/zjh/MetaGPT/metagpt/roles/role.py", line 208, in _react
return await self._act()
File "/home/zjh/MetaGPT/metagpt/roles/role.py", line 167, in _act
response = await self._rc.todo.run(self._rc.important_memory)
File "/home/zjh/MetaGPT/metagpt/actions/write_prd.py", line 145, in run
prd = await self._aask_v1(prompt, "prd", OUTPUT_MAPPING)
File "/home/zjh/.conda/envs/metagpt/lib/python3.9/site-packages/tenacity-8.2.2-py3.9.egg/tenacity/_asyncio.py", line 88, in async_wrapped
return await fn(*args, **kwargs)
File "/home/zjh/.conda/envs/metagpt/lib/python3.9/site-packages/tenacity-8.2.2-py3.9.egg/tenacity/_asyncio.py", line 47, in __call__
do = self.iter(retry_state=retry_state)
File "/home/zjh/.conda/envs/metagpt/lib/python3.9/site-packages/tenacity-8.2.2-py3.9.egg/tenacity/__init__.py", line 326, in iter
raise retry_exc from fut.exception()
tenacity.RetryError: RetryError[<Future at 0x7ff56e57f0d0 state=finished raised NotImplementedError>]
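For what it's worth, the underlying failure can be reproduced outside the retry loop. This is a minimal sketch based only on the traceback above (the message content is just an example); the same helper is presumably what prints the repeated "Warning: model not found. Using cl100k_base encoding." lines via its tiktoken fallback:

```python
# Minimal reproduction sketch of the root cause: count_message_tokens() only
# knows OpenAI model names, so the configured name "gpt-3.5" raises
# NotImplementedError, which tenacity then surfaces as a RetryError.
from metagpt.utils.token_counter import count_message_tokens

messages = [{"role": "user", "content": "Write a cli snake game"}]
try:
    count_message_tokens(messages, "gpt-3.5")
except NotImplementedError as e:
    print(e)
```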
@geekan @qa6300525
Comment out `_calc_usage` if needed.
Watching the logs, I see that it doesn't even reach my local server (I double-checked that I included the port, etc.).
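As a quick sanity check (a sketch: the base URL is a placeholder and the `/v1/models` endpoint is assumed from the server's OpenAI-compatible API), the llama-cpp-python server can be probed independently of MetaGPT:

```python
# Probe the local OpenAI-compatible server directly. If this raises a
# connection error, the configured base URL/port is the first thing to re-check.
import requests

base = "http://localhost:8000/v1"  # placeholder: use your configured OPENAI_API_BASE
resp = requests.get(f"{base}/models", timeout=5)
print(resp.status_code, resp.json())
```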
Comment out the function `_calc_usage`, I mean.
Some functions rely on OpenAI's return fields, but other models may not provide them.
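As an illustration of that suggestion (a sketch only, not the project's official fix; the usage-dict shape and the method body are assumptions based on the traceback), the token counting could be guarded instead of deleted:

```python
# Sketch of a tolerant _calc_usage for metagpt/provider/openai_api.py:
# if the configured model name is unknown to the token counter (e.g. a local
# llama2 served through llama-cpp-python), skip usage accounting instead of
# letting NotImplementedError bubble up into tenacity.RetryError.
from metagpt.utils.token_counter import count_message_tokens

def _calc_usage(self, messages: list[dict], rsp: str) -> dict:
    usage = {"prompt_tokens": 0, "completion_tokens": 0}
    try:
        usage["prompt_tokens"] = count_message_tokens(messages, self.model)
        # The real method also counts completion tokens; omitted here because
        # only count_message_tokens appears in the traceback above.
    except NotImplementedError:
        pass  # non-OpenAI model: leave usage at zero rather than crash
    return usage
```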
@TinaTiel @StudyingLover @tianclll