MetaGPT
Unable to use metagpt with claude/anthropic.
Bug description
I can't get MetaGPT to work with Anthropic Claude. Configuration used:
llm:
api_type: "anthropic" # or azure / ollama / open_llm etc. Check LLMType for more options
model: "claude-3-sonnet-20240229" # or gpt-3.5-turbo-1106 / gpt-4-1106-preview
base_url: "https://api.anthropic.com/v1/messages" # or forward url / other llm url
api_key: "<key>"
Bug Error
Unable to use MetaGPT with Claude/Anthropic.
Environment information
Basic GitHub Codespace ("2 core, 8 GB RAM, 32 GB"), Python 3.11.4, model: claude-3-sonnet-20240229, api_type: anthropic, base_url: https://api.anthropic.com/v1/messages
- LLM type and model name: Claude (claude-3-sonnet-20240229) from Anthropic
- System version: Linux codespaces-3fb395 6.2.0-1019-azure #19~22.04.1-Ubuntu SMP Wed Jan 10 22:57:03 UTC 2024 x86_64 GNU/Linux
- Python version: 3.11.4
- Packages version: not sure, I think the latest or the requested versions
- Installation method: pip install metagpt
Screenshots or logs
╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮
│ /home/vscode/.local/lib/python3.11/site-packages/metagpt/software_company.py:108 in startup │
│ │
│ 105 │ │ typer.echo("Missing argument 'IDEA'. Run 'metagpt --help' for more information." │
│ 106 │ │ raise typer.Exit() │
│ 107 │ │
│ ❱ 108 │ return generate_repo( │
│ 109 │ │ idea, │
│ 110 │ │ investment, │
│ 111 │ │ n_round, │
│ │
│ /home/vscode/.local/lib/python3.11/site-packages/metagpt/software_company.py:48 in generate_repo │
│ │
│ 45 │ │ company = Team(context=ctx) │
│ 46 │ │ company.hire( │
│ 47 │ │ │ [ │
│ ❱ 48 │ │ │ │ ProductManager(), │
│ 49 │ │ │ │ Architect(), │
│ 50 │ │ │ │ ProjectManager(), │
│ 51 │ │ │ ] │
│ │
│ /home/vscode/.local/lib/python3.11/site-packages/metagpt/roles/product_manager.py:34 in __init__ │
│ │
│ 31 │ todo_action: str = "" │
│ 32 │ │
│ 33 │ def __init__(self, **kwargs) -> None: │
│ ❱ 34 │ │ super().__init__(**kwargs) │
│ 35 │ │ │
│ 36 │ │ self.set_actions([PrepareDocuments, WritePRD]) │
│ 37 │ │ self._watch([UserRequirement, PrepareDocuments]) │
│ │
│ /home/vscode/.local/lib/python3.11/site-packages/pydantic/main.py:164 in __init__ │
│ │
│ 161 │ │ """ │
│ 162 │ │ # `__tracebackhide__` tells pytest and some other tools to omit this function fr │
│ 163 │ │ __tracebackhide__ = True │
│ ❱ 164 │ │ __pydantic_self__.__pydantic_validator__.validate_python(data, self_instance=__p │
│ 165 │ │
│ 166 │ # The following line sets a flag that we use to determine when `__init__` gets overr │
│ 167 │ __init__.__pydantic_base_init__ = True │
│ │
│ /home/vscode/.local/lib/python3.11/site-packages/metagpt/roles/role.py:167 in │
│ validate_role_extra │
│ │
│ 164 │ │
│ 165 │ @model_validator(mode="after") │
│ 166 │ def validate_role_extra(self): │
│ ❱ 167 │ │ self._process_role_extra() │
│ 168 │ │ return self │
│ 169 │ │
│ 170 │ def _process_role_extra(self): │
│ │
│ /home/vscode/.local/lib/python3.11/site-packages/metagpt/roles/role.py:177 in │
│ _process_role_extra │
│ │
│ 174 │ │ │ self.llm = HumanProvider(None) │
│ 175 │ │ │
│ 176 │ │ self._check_actions() │
│ ❱ 177 │ │ self.llm.system_prompt = self._get_prefix() │
│ 178 │ │ self._watch(kwargs.pop("watch", [UserRequirement])) │
│ 179 │ │ │
│ 180 │ │ if self.latest_observed_msg: │
│ │
│ /home/vscode/.local/lib/python3.11/site-packages/metagpt/context_mixin.py:95 in llm │
│ │
│ 92 │ │ """Role llm: if not existed, init from role.config""" │
│ 93 │ │ # print(f"class:{self.__class__.__name__}({self.name}), llm: {self._llm}, llm_co │
│ 94 │ │ if not self.private_llm: │
│ ❱ 95 │ │ │ self.private_llm = self.context.llm_with_cost_manager_from_llm_config(self.c │
│ 96 │ │ return self.private_llm │
│ 97 │ │
│ 98 │ @llm.setter │
│ │
│ /home/vscode/.local/lib/python3.11/site-packages/metagpt/context.py:94 in │
│ llm_with_cost_manager_from_llm_config │
│ │
│ 91 │ def llm_with_cost_manager_from_llm_config(self, llm_config: LLMConfig) -> BaseLLM: │
│ 92 │ │ """Return a LLM instance, fixme: support cache""" │
│ 93 │ │ # if self._llm is None: │
│ ❱ 94 │ │ llm = create_llm_instance(llm_config) │
│ 95 │ │ if llm.cost_manager is None: │
│ 96 │ │ │ llm.cost_manager = self.cost_manager │
│ 97 │ │ return llm │
│ │
│ /home/vscode/.local/lib/python3.11/site-packages/metagpt/provider/llm_provider_registry.py:36 in │
│ create_llm_instance │
│ │
│ 33 │
│ 34 def create_llm_instance(config: LLMConfig) -> BaseLLM: │
│ 35 │ """get the default llm provider""" │
│ ❱ 36 │ return LLM_REGISTRY.get_provider(config.api_type)(config) │
│ 37 │
│ 38 │
│ 39 # Registry instance │
│ │
│ /home/vscode/.local/lib/python3.11/site-packages/metagpt/provider/llm_provider_registry.py:21 in │
│ get_provider │
│ │
│ 18 │ │
│ 19 │ def get_provider(self, enum: LLMType): │
│ 20 │ │ """get provider instance according to the enum""" │
│ ❱ 21 │ │ return self.providers[enum] │
│ 22 │
│ 23 │
│ 24 def register_provider(key): │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
KeyError: <LLMType.ANTHROPIC: 'anthropic'>
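For context, a minimal sketch of the registry pattern the traceback points at (hypothetical names modeled on the traceback, not MetaGPT's exact code): providers register themselves under an LLMType key, and looking up a key that no installed provider has registered raises exactly this KeyError.

```python
from enum import Enum


class LLMType(Enum):
    OPENAI = "openai"
    ANTHROPIC = "anthropic"


class LLMProviderRegistry:
    """Maps an LLMType to a provider class; mirrors the lookup in the traceback."""

    def __init__(self):
        self.providers = {}

    def register(self, key: LLMType, provider_cls: type) -> None:
        self.providers[key] = provider_cls

    def get_provider(self, enum: LLMType) -> type:
        # KeyError if this key was never registered, e.g. on a release
        # that shipped before the Anthropic provider existed.
        return self.providers[enum]


registry = LLMProviderRegistry()
registry.register(LLMType.OPENAI, object)   # stand-in provider class
registry.get_provider(LLMType.ANTHROPIC)    # KeyError: <LLMType.ANTHROPIC: 'anthropic'>
```

A release that never registers the Anthropic provider fails at this lookup, which is why the follow-up below asks which version is installed.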
The anthropic provider is only available in the main branch.
What version of Metagpt are you using?
@iorisa
Name: metagpt
Version: 0.7.6
Summary: The Multi-Agent Framework
Home-page: https://github.com/geekan/MetaGPT
Author: Alexander Wu
Author-email: [email protected]
License: MIT
Location: /home/vscode/.local/lib/python3.11/site-packages
Requires: aiofiles, aiohttp, aioredis, anthropic, anytree, beautifulsoup4, channels, faiss-cpu, fire, gitignore-parser, gitpython, google-generativeai, imap-tools, ipykernel, ipython, ipywidgets, lancedb, langchain, libcst, loguru, meilisearch, nbclient, nbformat, networkx, numpy, openai, openpyxl, pandas, Pillow, playwright, pydantic, python-docx, PyYAML, qdrant-client, rich, scikit-learn, semantic-kernel, setuptools, socksio, ta, tenacity, tiktoken, tqdm, typer, typing-extensions, typing-inspect, websocket-client, websockets, wrapt, zhipuai
Required-by:
@iorisa
I updated to the main branch, but now I have this problem:
2024-03-14 18:41:23.844 | INFO | metagpt.const:get_metagpt_package_root:29 - Package root set to /workspaces/MetaGPT
/home/vscode/.local/lib/python3.11/site-packages/langchain/vectorstores/__init__.py:35: LangChainDeprecationWarning: Importing vector stores from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:
`from langchain_community.vectorstores import Chroma`.
To install langchain-community run `pip install -U langchain-community`.
warnings.warn(
/home/vscode/.local/lib/python3.11/site-packages/langchain/vectorstores/__init__.py:35: LangChainDeprecationWarning: Importing vector stores from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:
`from langchain_community.vectorstores import FAISS`.
To install langchain-community run `pip install -U langchain-community`.
warnings.warn(
2024-03-14 18:41:27.679 | INFO | metagpt.team:invest:90 - Investment: $3.0.
2024-03-14 18:41:27.681 | INFO | metagpt.roles.role:_act:397 - Alice(Product Manager): to do PrepareDocuments(PrepareDocuments)
2024-03-14 18:41:27.752 | INFO | metagpt.utils.file_repository:save:60 - save to: /workspaces/MetaGPT/workspace/20240314184127/docs/requirement.txt
2024-03-14 18:41:27.762 | INFO | metagpt.roles.role:_act:397 - Alice(Product Manager): to do WritePRD(WritePRD)
2024-03-14 18:41:27.765 | INFO | metagpt.actions.write_prd:run:86 - New requirement detected: <My Prompt>
2024-03-14 18:41:27.956 | ERROR | metagpt.utils.common:log_it:552 - Finished call to 'metagpt.actions.action_node.ActionNode._aask_v1' after 0.190(s), this was the 1st time calling it. exp: Error code: 404 - {'type': 'error', 'error': {'type': 'not_found_error', 'message': 'Not Found'}}
2024-03-14 18:41:28.235 | ERROR | metagpt.utils.common:log_it:552 - Finished call to 'metagpt.actions.action_node.ActionNode._aask_v1' after 0.469(s), this was the 2nd time calling it. exp: Error code: 404 - {'type': 'error', 'error': {'type': 'not_found_error', 'message': 'Not Found'}}
2024-03-14 18:41:29.730 | ERROR | metagpt.utils.common:log_it:552 - Finished call to 'metagpt.actions.action_node.ActionNode._aask_v1' after 1.964(s), this was the 3rd time calling it. exp: Error code: 404 - {'type': 'error', 'error': {'type': 'not_found_error', 'message': 'Not Found'}}
2024-03-14 18:41:29.884 | ERROR | metagpt.utils.common:log_it:552 - Finished call to 'metagpt.actions.action_node.ActionNode._aask_v1' after 2.117(s), this was the 4th time calling it. exp: Error code: 404 - {'type': 'error', 'error': {'type': 'not_found_error', 'message': 'Not Found'}}
Do you have any ideas on how to fix it? By the way, thanks so much for the last reply.
P.S. I already tried uninstalling and reinstalling langchain-community, but it gives the same error every time.
@AnonDevSUS We have removed langchain in the past two days. Can you try the latest main code again?
@AnonDevSUS
The base url should be base_url: 'https://api.anthropic.com', as shown in https://docs.deepwisdom.ai/main/en/guide/get_started/configuration.html#anthropic-claude-api, not the one you typed.
@better629 Can we solidify these default addresses?
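As an aside, a quick way to confirm that the key and the root base_url work outside of MetaGPT is to call the Anthropic SDK directly; a minimal throwaway sketch (file name and prompt are placeholders):

```python
# check_anthropic.py - standalone sanity check for the API key and base URL
import anthropic

client = anthropic.Anthropic(
    api_key="<key>",
    base_url="https://api.anthropic.com",  # root URL only; the SDK appends /v1/messages itself
)

msg = client.messages.create(
    model="claude-3-sonnet-20240229",
    max_tokens=64,
    messages=[{"role": "user", "content": "Reply with a single short sentence."}],
)
print(msg.content[0].text)
```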
I have updated to the latest main code and used "https://api.anthropic.com" as the base url.
But it seems incompatible with GitHub Codespaces?
$ metagpt "<my prompt>"
2024-03-21 11:51:23.880 | INFO | metagpt.const:get_metagpt_package_root:29 - Package root set to /workspaces/MetaGPT
╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮
│ /workspaces/MetaGPT/metagpt/software_company.py:108 in startup │
│ │
│ 105 │ │ typer.echo("Missing argument 'IDEA'. Run 'metagpt --help' for more information." │
│ 106 │ │ raise typer.Exit() │
│ 107 │ │
│ ❱ 108 │ return generate_repo( │
│ 109 │ │ idea, │
│ 110 │ │ investment, │
│ 111 │ │ n_round, │
│ │
│ /workspaces/MetaGPT/metagpt/software_company.py:32 in generate_repo │
│ │
│ 29 │ """Run the startup logic. Can be called from CLI or other Python scripts.""" │
│ 30 │ from metagpt.config2 import config │
│ 31 │ from metagpt.context import Context │
│ ❱ 32 │ from metagpt.roles import ( │
│ 33 │ │ Architect, │
│ 34 │ │ Engineer, │
│ 35 │ │ ProductManager, │
│ │
│ /workspaces/MetaGPT/metagpt/roles/__init__.py:9 in <module> │
│ │
│ 6 @File : __init__.py │
│ 7 """ │
│ 8 │
│ ❱ 9 from metagpt.roles.role import Role │
│ 10 from metagpt.roles.architect import Architect │
│ 11 from metagpt.roles.project_manager import ProjectManager │
│ 12 from metagpt.roles.product_manager import ProductManager │
│ │
│ /workspaces/MetaGPT/metagpt/roles/role.py:597 in <module> │
│ │
│ 594 │ │ return "" │
│ 595 │
│ 596 │
│ ❱ 597 RoleContext.model_rebuild() │
│ 598 │
│ │
│ /workspaces/MetaGPT/metagpt/roles/role.py:122 in model_rebuild │
│ │
│ 119 │ │
│ 120 │ @classmethod │
│ 121 │ def model_rebuild(cls, **kwargs): │
│ ❱ 122 │ │ from metagpt.environment.base_env import Environment # noqa: F401 │
│ 123 │ │ │
│ 124 │ │ super().model_rebuild(**kwargs) │
│ 125 │
│ │
│ /workspaces/MetaGPT/metagpt/environment/__init__.py:7 in <module> │
│ │
│ 4 │
│ 5 from metagpt.environment.base_env import Environment │
│ 6 from metagpt.environment.android_env.android_env import AndroidEnv │
│ ❱ 7 from metagpt.environment.mincraft_env.mincraft_env import MincraftExtEnv │
│ 8 from metagpt.environment.werewolf_env.werewolf_env import WerewolfEnv │
│ 9 from metagpt.environment.stanford_town_env.stanford_town_env import StanfordTownEnv │
│ 10 from metagpt.environment.software_env.software_env import SoftwareEnv │
│ │
│ /workspaces/MetaGPT/metagpt/environment/mincraft_env/mincraft_env.py:18 in <module> │
│ │
│ 15 from metagpt.environment.mincraft_env.const import MC_CKPT_DIR │
│ 16 from metagpt.environment.mincraft_env.mincraft_ext_env import MincraftExtEnv │
│ 17 from metagpt.logs import logger │
│ ❱ 18 from metagpt.rag.vector_stores.chroma import ChromaVectorStore │
│ 19 from metagpt.utils.common import load_mc_skills_code, read_json_file, write_json_file │
│ 20 │
│ 21 │
│ │
│ /workspaces/MetaGPT/metagpt/rag/vector_stores/chroma/__init__.py:1 in <module> │
│ │
│ ❱ 1 from metagpt.rag.vector_stores.chroma.base import ChromaVectorStore │
│ 2 │
│ 3 __all__ = ["ChromaVectorStore"] │
│ 4 │
│ │
│ /workspaces/MetaGPT/metagpt/rag/vector_stores/chroma/base.py:10 in <module> │
│ │
│ 7 import math │
│ 8 from typing import Any, Dict, Generator, List, Optional, cast │
│ 9 │
│ ❱ 10 import chromadb │
│ 11 from chromadb.api.models.Collection import Collection │
│ 12 from llama_index.core.bridge.pydantic import Field, PrivateAttr │
│ 13 from llama_index.core.schema import BaseNode, MetadataMode, TextNode │
│ │
│ /home/vscode/.local/lib/python3.11/site-packages/chromadb/__init__.py:79 in <module> │
│ │
│ 76 │ │ │ __import__("pysqlite3") │
│ 77 │ │ │ sys.modules["sqlite3"] = sys.modules.pop("pysqlite3") │
│ 78 │ │ else: │
│ ❱ 79 │ │ │ raise RuntimeError( │
│ 80 │ │ │ │ "\033[91mYour system has an unsupported version of sqlite3. Chroma \ │
│ 81 │ │ │ │ │ requires sqlite3 >= 3.35.0.\033[0m\n" │
│ 82 │ │ │ │ "\033[94mPlease visit \ │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
RuntimeError: Your system has an unsupported version of sqlite3. Chroma requires sqlite3 >= 3.35.0.
Please visit https://docs.trychroma.com/troubleshooting#sqlite to learn how
@AnonDevSUS How did you install it? It seems to be a problem with chroma. I wonder if the installation of chroma was successful?
@geekan I ran these commands:
$ pip uninstall metagpt
$ git pull
$ pip install -e .
chroma has caused some problems before. Maybe it would be better to consider removing this dependency and only importing it when needed. @better629 What do you think?
@geekan add an independent requirements_rag.txt?
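A minimal sketch of the deferred-import idea discussed above (a hypothetical helper, not MetaGPT's actual module layout): chromadb is only imported when a vector store is actually requested, so the CLI can still start on systems where chromadb fails to load.

```python
def get_chroma_collection(collection_name: str = "metagpt"):
    """Create or fetch a Chroma collection, importing chromadb lazily."""
    try:
        # Imported here, not at module import time, so `metagpt "<idea>"`
        # does not crash just because chromadb (or its sqlite3 check) fails.
        import chromadb
    except (ImportError, RuntimeError) as exc:  # RuntimeError covers the sqlite3 >= 3.35 check
        raise RuntimeError(
            "chromadb is unavailable; install the RAG extras to use vector stores"
        ) from exc
    client = chromadb.Client()
    return client.get_or_create_collection(collection_name)
```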
@AnonDevSUS what's your system and version?
@better629 I'm on a 2-core GitHub Codespace; I retrieved some info about the system:
$ uname -a
Linux codespaces-3fb395 6.2.0-1019-azure #19~22.04.1-Ubuntu SMP Wed Jan 10 22:57:03 UTC 2024 x86_64 GNU/Linux
Here is some lscpu info:
Model name: AMD EPYC 7763 64-Core Processor
CPU(s): 2
Architecture: x86_64
CPU op-mode(s): 32-bit, 64-bit
$ pip -V
pip 24.0 from /home/vscode/.local/lib/python3.11/site-packages/pip (python 3.11)
$ python -V
Python 3.11.4
@AnonDevSUS Chroma requires SQLite >= 3.35.0.
- check your sqlite version
(metagpt) MacBook-Pro:MetaGPT xxx$ sqlite3 --version
3.41.2 2023-03-22 11:56:21 0d1fc92f94cb6b7xxxx
(metagpt) MacBook-Pro:MetaGPT xxx$ python3
>>> import sqlite3
>>> sqlite3.version
'2.6.0'
>>> sqlite3.sqlite_version
'3.41.2'
- update sqlite version
apt-get remove -y --auto-remove sqlite3
apt -y install sqlite3 libsqlite3-dev
and check the version again.
You can try this first~
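Note: chromadb's own __init__ (visible in the traceback above) already falls back to a pysqlite3 module when one is importable, so on systems whose packaged sqlite3 is too old, another possible workaround, assuming the pysqlite3-binary wheel is acceptable, is to install it; installing it may be enough on its own, and the sketch below just makes the same module swap explicit.

```python
# Workaround sketch, assuming `pip install pysqlite3-binary` has been run.
import sys

import pysqlite3  # ships its own recent SQLite, independent of the system library

# Same swap chromadb attempts internally: make stdlib consumers use pysqlite3.
sys.modules["sqlite3"] = sys.modules.pop("pysqlite3")

import chromadb  # now sees sqlite3 >= 3.35 and skips the RuntimeError

print(chromadb.__version__)
```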
@better629 Here are the outputs:
Before running "apt-get remove -y --auto-remove sqlite3" and "apt -y install sqlite3 libsqlite3-dev":
$ sqlite3 --version
bash: sqlite3: command not found
$ python3
Python 3.11.4 (main, Jun 7 2023, 18:32:58) [GCC 10.2.1 20210110] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import sqlite3
>>> sqlite3.version
'2.6.0'
>>> sqlite3.sqlite_version
'3.34.1'
Trying to uninstall sqlite3 and install sqlite3 and libsqlite3-dev:
$ sudo apt-get remove -y --auto-remove sqlite3
Reading package lists... Done
Building dependency tree... Done
Reading state information... Done
E: Unable to locate package sqlite3
$ sudo apt -y install sqlite3 libsqlite3-dev
Reading package lists... Done
Building dependency tree... Done
Reading state information... Done
E: Unable to locate package sqlite3
$ sudo apt update
Get:1 http://deb.debian.org/debian bullseye InRelease [116 kB]
Get:2 http://deb.debian.org/debian-security bullseye-security InRelease [48.4 kB]
Get:3 http://deb.debian.org/debian bullseye-updates InRelease [44.1 kB]
Get:4 https://dl.yarnpkg.com/debian stable InRelease [17.1 kB]
Get:5 http://deb.debian.org/debian bullseye/main amd64 Packages [8068 kB]
Get:6 http://deb.debian.org/debian-security bullseye-security/main amd64 Packages [270 kB]
Get:7 http://deb.debian.org/debian bullseye-updates/main amd64 Packages [18.8 kB]
Get:8 https://dl.yarnpkg.com/debian stable/main amd64 Packages [10.9 kB]
Get:9 https://dl.yarnpkg.com/debian stable/main all Packages [10.9 kB]
Fetched 8604 kB in 1s (5845 kB/s)
Reading package lists... Done
Building dependency tree... Done
Reading state information... Done
78 packages can be upgraded. Run 'apt list --upgradable' to see them.
$ sudo apt -y install sqlite3 libsqlite3-dev
Reading package lists... Done
Building dependency tree... Done
Reading state information... Done
libsqlite3-dev is already the newest version (3.34.1-3).
Suggested packages:
sqlite3-doc
The following NEW packages will be installed:
sqlite3
0 upgraded, 1 newly installed, 0 to remove and 78 not upgraded.
Need to get 1201 kB of archives.
After this operation, 3155 kB of additional disk space will be used.
Get:1 http://deb.debian.org/debian bullseye/main amd64 sqlite3 amd64 3.34.1-3 [1201 kB]
Fetched 1201 kB in 0s (44.7 MB/s)
Selecting previously unselected package sqlite3.
(Reading database ... 28352 files and directories currently installed.)
Preparing to unpack .../sqlite3_3.34.1-3_amd64.deb ...
Unpacking sqlite3 (3.34.1-3) ...
Setting up sqlite3 (3.34.1-3) ...
Processing triggers for man-db (2.9.4-2) ...
After running "apt -y install sqlite3 libsqlite3-dev":
$ sqlite3 --version
3.34.1 2021-01-20 14:10:07 10e20c0b43500cfb9bbc0eaa061c57514f715d87238f4d835880cd846b9ealt1
$ python3
Python 3.11.4 (main, Jun 7 2023, 18:32:58) [GCC 10.2.1 20210110] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import sqlite3
>>> sqlite3.version
'2.6.0'
>>> sqlite3.sqlite_version
'3.34.1'
It seems the system's default sqlite3 version is too low; can you try this https://mosdia.com/blog/install-sqlite3-linux/ to install a newer precompiled binary version?
@better629 Sorry for the delay in replying, but I've been busy.
The precompiled sqlite3 doesn't work because it requires GLIBC_2.33 and GLIBC_2.34. I tried to update glibc, but the latest version available for this OS is GLIBC_2.31.
$ ./sqlite3 --version
./sqlite3: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.33' not found (required by ./sqlite3)
./sqlite3: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.34' not found (required by ./sqlite3)
$ strings /lib/x86_64-linux-gnu/libc.so.6 | grep GLIBC
GLIBC_2.2.5
GLIBC_2.2.6
GLIBC_2.3
GLIBC_2.3.2
GLIBC_2.3.3
GLIBC_2.3.4
GLIBC_2.4
GLIBC_2.5
GLIBC_2.6
GLIBC_2.7
GLIBC_2.8
GLIBC_2.9
GLIBC_2.10
GLIBC_2.11
GLIBC_2.12
GLIBC_2.13
GLIBC_2.14
GLIBC_2.15
GLIBC_2.16
GLIBC_2.17
GLIBC_2.18
GLIBC_2.22
GLIBC_2.23
GLIBC_2.24
GLIBC_2.25
GLIBC_2.26
GLIBC_2.27
GLIBC_2.28
GLIBC_2.29
GLIBC_2.30
GLIBC_PRIVATE
GNU C Library (Debian GLIBC 2.31-13+deb11u8) stable release version 2.31.
I also tried compiling from source, but the output only contains the sqlite3 binary; the other two (sqlite3_analyzer and sqldiff) are missing. Now I'm updating MetaGPT to see if that fixes it.
It seems to work, but sometimes it shows this error:
[/CONTENT]
2024-03-30 11:12:58.789 | WARNING | metagpt.utils.cost_manager:update_cost:49 - Model claude-3-haiku-20240307 not found in TOKEN_COSTS.
2024-03-30 11:12:58.794 | ERROR | metagpt.utils.common:log_it:554 - Finished call to 'metagpt.actions.action_node.ActionNode._aask_v1' after 10.326(s), this was the 1st time calling it. exp: 1 validation error for WritePRD_AN
Value error, Missing fields: {'Anything UNCLEAR'} [type=value_error, input_value={'Language': 'en_us', 'Pr...h gameplay experience."}, input_type=dict]
For further information visit https://errors.pydantic.dev/2.5/v/value_error
After a while it gave me this error and stopped:
The `UI` class follows the design specified in the context, and it can be used by the `Game` class to update and display the game's UI elements.
2024-03-30 11:16:34.742 | WARNING | metagpt.utils.cost_manager:update_cost:49 - Model claude-3-haiku-20240307 not found in TOKEN_COSTS.
2024-03-30 11:16:34.743 | INFO | metagpt.actions.write_code_review:run:175 - Code review and rewrite ui.py: 1/2 | len(iterative_code)=1695, len(self.i_context.code_doc.content)=1695
2024-03-30 11:16:59.529 | WARNING | metagpt.utils.common:wrapper:649 - There is a exception in role's execution, in order to resume, we delete the newest role communication message in the role's memory.
2024-03-30 11:16:59.558 | ERROR | metagpt.utils.common:wrapper:631 - Exception occurs, start to serialize the project, exp:
Traceback (most recent call last):
File "/home/vscode/.local/lib/python3.11/site-packages/tenacity/_asyncio.py", line 50, in __call__
result = await fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspaces/MetaGPT/metagpt/actions/write_code_review.py", line 127, in write_code_review_and_rewrite
cr_rsp = await self._aask(context_prompt + cr_prompt)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
anthropic.RateLimitError: Error code: 429 - {'type': 'error', 'error': {'type': 'rate_limit_error', 'message': 'Number of requests has exceeded your rate limit (https://docs.anthropic.com/claude/reference/rate-limits). Please try again later or contact sales at https://www.anthropic.com/contact-sales to discuss your options for a rate limit increase.'}}
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/workspaces/MetaGPT/metagpt/utils/common.py", line 640, in wrapper
return await func(self, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspaces/MetaGPT/metagpt/roles/role.py", line 550, in run
rsp = await self.react()
^^^^^^^^^^^^^^^^^^
tenacity.RetryError: RetryError[<Future at 0x7fda681c11d0 state=finished raised RateLimitError>]
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/workspaces/MetaGPT/metagpt/utils/common.py", line 626, in wrapper
result = await func(self, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspaces/MetaGPT/metagpt/team.py", line 134, in run
await self.env.run()
Exception: Traceback (most recent call last):
File "/home/vscode/.local/lib/python3.11/site-packages/tenacity/_asyncio.py", line 50, in __call__
result = await fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspaces/MetaGPT/metagpt/actions/write_code_review.py", line 127, in write_code_review_and_rewrite
cr_rsp = await self._aask(context_prompt + cr_prompt)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspaces/MetaGPT/metagpt/actions/action.py", line 93, in _aask
return await self.llm.aask(prompt, system_msgs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspaces/MetaGPT/metagpt/provider/base_llm.py", line 150, in aask
rsp = await self.acompletion_text(message, stream=stream, timeout=self.get_timeout(timeout))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/vscode/.local/lib/python3.11/site-packages/tenacity/_asyncio.py", line 88, in async_wrapped
return await fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/vscode/.local/lib/python3.11/site-packages/tenacity/_asyncio.py", line 47, in __call__
do = self.iter(retry_state=retry_state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/vscode/.local/lib/python3.11/site-packages/tenacity/__init__.py", line 314, in iter
return fut.result()
^^^^^^^^^^^^
File "/usr/local/lib/python3.11/concurrent/futures/_base.py", line 449, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/concurrent/futures/_base.py", line 401, in __get_result
raise self._exception
File "/home/vscode/.local/lib/python3.11/site-packages/tenacity/_asyncio.py", line 50, in __call__
result = await fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspaces/MetaGPT/metagpt/provider/base_llm.py", line 200, in acompletion_text
return await self._achat_completion_stream(messages, timeout=self.get_timeout(timeout))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspaces/MetaGPT/metagpt/provider/anthropic_api.py", line 54, in _achat_completion_stream
stream = await self.aclient.messages.create(**self._const_kwargs(messages, stream=True))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/vscode/.local/lib/python3.11/site-packages/anthropic/resources/messages.py", line 1364, in create
return await self._post(
^^^^^^^^^^^^^^^^^
File "/home/vscode/.local/lib/python3.11/site-packages/anthropic/_base_client.py", line 1728, in post
return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/vscode/.local/lib/python3.11/site-packages/anthropic/_base_client.py", line 1431, in request
return await self._request(
^^^^^^^^^^^^^^^^^^^^
File "/home/vscode/.local/lib/python3.11/site-packages/anthropic/_base_client.py", line 1507, in _request
return await self._retry_request(
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/vscode/.local/lib/python3.11/site-packages/anthropic/_base_client.py", line 1553, in _retry_request
return await self._request(
^^^^^^^^^^^^^^^^^^^^
File "/home/vscode/.local/lib/python3.11/site-packages/anthropic/_base_client.py", line 1507, in _request
return await self._retry_request(
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/vscode/.local/lib/python3.11/site-packages/anthropic/_base_client.py", line 1553, in _retry_request
return await self._request(
^^^^^^^^^^^^^^^^^^^^
File "/home/vscode/.local/lib/python3.11/site-packages/anthropic/_base_client.py", line 1522, in _request
raise self._make_status_error_from_response(err.response) from None
anthropic.RateLimitError: Error code: 429 - {'type': 'error', 'error': {'type': 'rate_limit_error', 'message': 'Number of requests has exceeded your rate limit (https://docs.anthropic.com/claude/reference/rate-limits). Please try again later or contact sales at https://www.anthropic.com/contact-sales to discuss your options for a rate limit increase.'}}
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/workspaces/MetaGPT/metagpt/utils/common.py", line 640, in wrapper
return await func(self, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspaces/MetaGPT/metagpt/roles/role.py", line 550, in run
rsp = await self.react()
^^^^^^^^^^^^^^^^^^
File "/workspaces/MetaGPT/metagpt/roles/role.py", line 517, in react
rsp = await self._react()
^^^^^^^^^^^^^^^^^^^
File "/workspaces/MetaGPT/metagpt/roles/role.py", line 463, in _react
rsp = await self._act()
^^^^^^^^^^^^^^^^^
File "/workspaces/MetaGPT/metagpt/roles/engineer.py", line 148, in _act
return await self._act_write_code()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspaces/MetaGPT/metagpt/roles/engineer.py", line 155, in _act_write_code
changed_files = await self._act_sp_with_cr(review=self.use_code_review)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspaces/MetaGPT/metagpt/roles/engineer.py", line 116, in _act_sp_with_cr
coding_context = await action.run()
^^^^^^^^^^^^^^^^^^
File "/workspaces/MetaGPT/metagpt/actions/write_code_review.py", line 179, in run
result, rewrited_code = await self.write_code_review_and_rewrite(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/vscode/.local/lib/python3.11/site-packages/tenacity/_asyncio.py", line 88, in async_wrapped
return await fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/vscode/.local/lib/python3.11/site-packages/tenacity/_asyncio.py", line 47, in __call__
do = self.iter(retry_state=retry_state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/vscode/.local/lib/python3.11/site-packages/tenacity/__init__.py", line 326, in iter
raise retry_exc from fut.exception()
tenacity.RetryError: RetryError[<Future at 0x7fda681c11d0 state=finished raised RateLimitError>]
@AnonDevSUS It's caused by the LLM sometimes failing to follow the prompt instructions; maybe you can try a few more times.
The rate_limit_error is due to API limits; although we have added an exponential backoff retry strategy, the rate limit may still be too strict.
We suggest trying the OpenAI API first, then other LLMs~
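For reference, a minimal sketch of the kind of exponential-backoff retry that tenacity provides (a hypothetical wrapper with illustrative settings, not MetaGPT's actual decorator); widening the wait window can help when a per-minute request limit is the bottleneck:

```python
import anthropic
from tenacity import retry, retry_if_exception_type, stop_after_attempt, wait_exponential


@retry(
    retry=retry_if_exception_type(anthropic.RateLimitError),
    wait=wait_exponential(multiplier=2, min=4, max=120),  # exponentially growing waits, 4 s up to 2 min
    stop=stop_after_attempt(6),
    reraise=True,
)
def ask(client: anthropic.Anthropic, prompt: str) -> str:
    """Single prompt/response call that backs off and retries on 429s."""
    msg = client.messages.create(
        model="claude-3-haiku-20240307",
        max_tokens=1024,
        messages=[{"role": "user", "content": prompt}],
    )
    return msg.content[0].text
```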