agent-zero
Requirements installation fails due to dependency conflicts
Clean installation in a new environment (using conda). Here is the log:
(agent-zero) ➜ agent-zero git:(main) pip install -r requirements.txt
Collecting ansio==0.0.1 (from -r requirements.txt (line 1))
Downloading ansio-0.0.1-py3-none-any.whl.metadata (739 bytes)
Collecting python-dotenv==1.0.1 (from -r requirements.txt (line 2))
Using cached python_dotenv-1.0.1-py3-none-any.whl.metadata (23 kB)
Collecting langchain-groq==0.1.6 (from -r requirements.txt (line 3))
Downloading langchain_groq-0.1.6-py3-none-any.whl.metadata (2.8 kB)
Collecting langchain-huggingface==0.0.3 (from -r requirements.txt (line 4))
Downloading langchain_huggingface-0.0.3-py3-none-any.whl.metadata (1.2 kB)
Collecting langchain-openai==0.1.15 (from -r requirements.txt (line 5))
Downloading langchain_openai-0.1.15-py3-none-any.whl.metadata (2.5 kB)
Collecting langchain-community==0.2.7 (from -r requirements.txt (line 6))
Downloading langchain_community-0.2.7-py3-none-any.whl.metadata (2.5 kB)
Collecting langchain-anthropic==0.1.19 (from -r requirements.txt (line 7))
Downloading langchain_anthropic-0.1.19-py3-none-any.whl.metadata (2.1 kB)
Collecting langchain-chroma==0.1.2 (from -r requirements.txt (line 8))
Downloading langchain_chroma-0.1.2-py3-none-any.whl.metadata (1.3 kB)
Collecting langchain-google-genai==1.0.7 (from -r requirements.txt (line 9))
Downloading langchain_google_genai-1.0.7-py3-none-any.whl.metadata (3.8 kB)
Collecting webcolors==24.6.0 (from -r requirements.txt (line 10))
Downloading webcolors-24.6.0-py3-none-any.whl.metadata (2.6 kB)
Collecting sentence-transformers==3.0.1 (from -r requirements.txt (line 11))
Downloading sentence_transformers-3.0.1-py3-none-any.whl.metadata (10 kB)
Collecting docker==7.1.0 (from -r requirements.txt (line 12))
Downloading docker-7.1.0-py3-none-any.whl.metadata (3.8 kB)
Collecting paramiko==3.4.0 (from -r requirements.txt (line 13))
Downloading paramiko-3.4.0-py3-none-any.whl.metadata (4.4 kB)
Collecting duckduckgo_search==6.1.12 (from -r requirements.txt (line 14))
Downloading duckduckgo_search-6.1.12-py3-none-any.whl.metadata (24 kB)
Collecting inputimeout==1.0.4 (from -r requirements.txt (line 15))
Downloading inputimeout-1.0.4-py3-none-any.whl.metadata (2.2 kB)
Collecting groq<1,>=0.4.1 (from langchain-groq==0.1.6->-r requirements.txt (line 3))
Downloading groq-0.9.0-py3-none-any.whl.metadata (13 kB)
Collecting langchain-core<0.3,>=0.2.2 (from langchain-groq==0.1.6->-r requirements.txt (line 3))
Downloading langchain_core-0.2.24-py3-none-any.whl.metadata (6.2 kB)
Collecting huggingface-hub>=0.23.0 (from langchain-huggingface==0.0.3->-r requirements.txt (line 4))
Downloading huggingface_hub-0.24.3-py3-none-any.whl.metadata (13 kB)
Collecting tokenizers>=0.19.1 (from langchain-huggingface==0.0.3->-r requirements.txt (line 4))
Downloading tokenizers-0.19.1-cp312-cp312-macosx_10_12_x86_64.whl.metadata (6.7 kB)
Collecting transformers>=4.39.0 (from langchain-huggingface==0.0.3->-r requirements.txt (line 4))
Downloading transformers-4.43.3-py3-none-any.whl.metadata (43 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 43.7/43.7 kB 752.2 kB/s eta 0:00:00
Collecting openai<2.0.0,>=1.32.0 (from langchain-openai==0.1.15->-r requirements.txt (line 5))
Downloading openai-1.37.1-py3-none-any.whl.metadata (22 kB)
Collecting tiktoken<1,>=0.7 (from langchain-openai==0.1.15->-r requirements.txt (line 5))
Downloading tiktoken-0.7.0-cp312-cp312-macosx_10_9_x86_64.whl.metadata (6.6 kB)
Collecting PyYAML>=5.3 (from langchain-community==0.2.7->-r requirements.txt (line 6))
Downloading PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl.metadata (2.1 kB)
Collecting SQLAlchemy<3,>=1.4 (from langchain-community==0.2.7->-r requirements.txt (line 6))
Downloading SQLAlchemy-2.0.31-cp312-cp312-macosx_10_9_x86_64.whl.metadata (9.6 kB)
Collecting aiohttp<4.0.0,>=3.8.3 (from langchain-community==0.2.7->-r requirements.txt (line 6))
Downloading aiohttp-3.9.5-cp312-cp312-macosx_10_9_x86_64.whl.metadata (7.5 kB)
Collecting dataclasses-json<0.7,>=0.5.7 (from langchain-community==0.2.7->-r requirements.txt (line 6))
Downloading dataclasses_json-0.6.7-py3-none-any.whl.metadata (25 kB)
Collecting langchain<0.3.0,>=0.2.7 (from langchain-community==0.2.7->-r requirements.txt (line 6))
Downloading langchain-0.2.11-py3-none-any.whl.metadata (7.1 kB)
Collecting langsmith<0.2.0,>=0.1.0 (from langchain-community==0.2.7->-r requirements.txt (line 6))
Downloading langsmith-0.1.94-py3-none-any.whl.metadata (13 kB)
Collecting numpy<2.0.0,>=1.26.0 (from langchain-community==0.2.7->-r requirements.txt (line 6))
Downloading numpy-1.26.4-cp312-cp312-macosx_10_9_x86_64.whl.metadata (61 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 61.1/61.1 kB 1.6 MB/s eta 0:00:00
Collecting requests<3,>=2 (from langchain-community==0.2.7->-r requirements.txt (line 6))
Using cached requests-2.32.3-py3-none-any.whl.metadata (4.6 kB)
Collecting tenacity!=8.4.0,<9.0.0,>=8.1.0 (from langchain-community==0.2.7->-r requirements.txt (line 6))
Using cached tenacity-8.5.0-py3-none-any.whl.metadata (1.2 kB)
Collecting anthropic<1,>=0.28.0 (from langchain-anthropic==0.1.19->-r requirements.txt (line 7))
Downloading anthropic-0.32.0-py3-none-any.whl.metadata (18 kB)
Collecting defusedxml<0.8.0,>=0.7.1 (from langchain-anthropic==0.1.19->-r requirements.txt (line 7))
Downloading defusedxml-0.7.1-py2.py3-none-any.whl.metadata (32 kB)
Collecting chromadb<0.6.0,>=0.4.0 (from langchain-chroma==0.1.2->-r requirements.txt (line 8))
Downloading chromadb-0.5.5-py3-none-any.whl.metadata (6.8 kB)
Collecting fastapi<1,>=0.95.2 (from langchain-chroma==0.1.2->-r requirements.txt (line 8))
Downloading fastapi-0.111.1-py3-none-any.whl.metadata (26 kB)
Collecting google-generativeai<0.8.0,>=0.7.0 (from langchain-google-genai==1.0.7->-r requirements.txt (line 9))
Downloading google_generativeai-0.7.2-py3-none-any.whl.metadata (4.0 kB)
Collecting tqdm (from sentence-transformers==3.0.1->-r requirements.txt (line 11))
Downloading tqdm-4.66.4-py3-none-any.whl.metadata (57 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 57.6/57.6 kB 1.1 MB/s eta 0:00:00
Collecting torch>=1.11.0 (from sentence-transformers==3.0.1->-r requirements.txt (line 11))
Downloading torch-2.2.2-cp312-none-macosx_10_9_x86_64.whl.metadata (25 kB)
Collecting scikit-learn (from sentence-transformers==3.0.1->-r requirements.txt (line 11))
Using cached scikit_learn-1.5.1-cp312-cp312-macosx_10_9_x86_64.whl.metadata (12 kB)
Collecting scipy (from sentence-transformers==3.0.1->-r requirements.txt (line 11))
Using cached scipy-1.14.0-cp312-cp312-macosx_10_9_x86_64.whl.metadata (60 kB)
Collecting Pillow (from sentence-transformers==3.0.1->-r requirements.txt (line 11))
Downloading pillow-10.4.0-cp312-cp312-macosx_10_10_x86_64.whl.metadata (9.2 kB)
Collecting urllib3>=1.26.0 (from docker==7.1.0->-r requirements.txt (line 12))
Using cached urllib3-2.2.2-py3-none-any.whl.metadata (6.4 kB)
Collecting bcrypt>=3.2 (from paramiko==3.4.0->-r requirements.txt (line 13))
Downloading bcrypt-4.2.0-cp39-abi3-macosx_10_12_universal2.whl.metadata (9.6 kB)
Collecting cryptography>=3.3 (from paramiko==3.4.0->-r requirements.txt (line 13))
Downloading cryptography-43.0.0-cp39-abi3-macosx_10_9_universal2.whl.metadata (5.4 kB)
Collecting pynacl>=1.5 (from paramiko==3.4.0->-r requirements.txt (line 13))
Using cached PyNaCl-1.5.0-cp36-abi3-macosx_10_10_universal2.whl.metadata (8.7 kB)
Collecting click>=8.1.7 (from duckduckgo_search==6.1.12->-r requirements.txt (line 14))
Using cached click-8.1.7-py3-none-any.whl.metadata (3.0 kB)
Collecting pyreqwest-impersonate>=0.4.9 (from duckduckgo_search==6.1.12->-r requirements.txt (line 14))
Downloading pyreqwest_impersonate-0.5.3-cp38-abi3-macosx_10_12_x86_64.whl.metadata (10 kB)
Collecting aiosignal>=1.1.2 (from aiohttp<4.0.0,>=3.8.3->langchain-community==0.2.7->-r requirements.txt (line 6))
Using cached aiosignal-1.3.1-py3-none-any.whl.metadata (4.0 kB)
Collecting attrs>=17.3.0 (from aiohttp<4.0.0,>=3.8.3->langchain-community==0.2.7->-r requirements.txt (line 6))
Using cached attrs-23.2.0-py3-none-any.whl.metadata (9.5 kB)
Collecting frozenlist>=1.1.1 (from aiohttp<4.0.0,>=3.8.3->langchain-community==0.2.7->-r requirements.txt (line 6))
Downloading frozenlist-1.4.1-cp312-cp312-macosx_10_9_x86_64.whl.metadata (12 kB)
Collecting multidict<7.0,>=4.5 (from aiohttp<4.0.0,>=3.8.3->langchain-community==0.2.7->-r requirements.txt (line 6))
Downloading multidict-6.0.5-cp312-cp312-macosx_10_9_x86_64.whl.metadata (4.2 kB)
Collecting yarl<2.0,>=1.0 (from aiohttp<4.0.0,>=3.8.3->langchain-community==0.2.7->-r requirements.txt (line 6))
Downloading yarl-1.9.4-cp312-cp312-macosx_10_9_x86_64.whl.metadata (31 kB)
Collecting anyio<5,>=3.5.0 (from anthropic<1,>=0.28.0->langchain-anthropic==0.1.19->-r requirements.txt (line 7))
Downloading anyio-4.4.0-py3-none-any.whl.metadata (4.6 kB)
Collecting distro<2,>=1.7.0 (from anthropic<1,>=0.28.0->langchain-anthropic==0.1.19->-r requirements.txt (line 7))
Using cached distro-1.9.0-py3-none-any.whl.metadata (6.8 kB)
Collecting httpx<1,>=0.23.0 (from anthropic<1,>=0.28.0->langchain-anthropic==0.1.19->-r requirements.txt (line 7))
Using cached httpx-0.27.0-py3-none-any.whl.metadata (7.2 kB)
Collecting jiter<1,>=0.4.0 (from anthropic<1,>=0.28.0->langchain-anthropic==0.1.19->-r requirements.txt (line 7))
Downloading jiter-0.5.0-cp312-cp312-macosx_10_12_x86_64.whl.metadata (3.6 kB)
Collecting pydantic<3,>=1.9.0 (from anthropic<1,>=0.28.0->langchain-anthropic==0.1.19->-r requirements.txt (line 7))
Downloading pydantic-2.8.2-py3-none-any.whl.metadata (125 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 125.2/125.2 kB 1.4 MB/s eta 0:00:00
Collecting sniffio (from anthropic<1,>=0.28.0->langchain-anthropic==0.1.19->-r requirements.txt (line 7))
Using cached sniffio-1.3.1-py3-none-any.whl.metadata (3.9 kB)
Collecting typing-extensions<5,>=4.7 (from anthropic<1,>=0.28.0->langchain-anthropic==0.1.19->-r requirements.txt (line 7))
Downloading typing_extensions-4.12.2-py3-none-any.whl.metadata (3.0 kB)
Collecting build>=1.0.3 (from chromadb<0.6.0,>=0.4.0->langchain-chroma==0.1.2->-r requirements.txt (line 8))
Downloading build-1.2.1-py3-none-any.whl.metadata (4.3 kB)
Collecting chroma-hnswlib==0.7.6 (from chromadb<0.6.0,>=0.4.0->langchain-chroma==0.1.2->-r requirements.txt (line 8))
Downloading chroma_hnswlib-0.7.6-cp312-cp312-macosx_10_9_x86_64.whl.metadata (252 bytes)
Collecting uvicorn>=0.18.3 (from uvicorn[standard]>=0.18.3->chromadb<0.6.0,>=0.4.0->langchain-chroma==0.1.2->-r requirements.txt (line 8))
Downloading uvicorn-0.30.3-py3-none-any.whl.metadata (6.5 kB)
Collecting posthog>=2.4.0 (from chromadb<0.6.0,>=0.4.0->langchain-chroma==0.1.2->-r requirements.txt (line 8))
Downloading posthog-3.5.0-py2.py3-none-any.whl.metadata (2.0 kB)
INFO: pip is looking at multiple versions of chromadb to determine which version is compatible with other requirements. This could take a while.
Collecting chromadb<0.6.0,>=0.4.0 (from langchain-chroma==0.1.2->-r requirements.txt (line 8))
Downloading chromadb-0.5.4-py3-none-any.whl.metadata (6.8 kB)
Collecting chroma-hnswlib==0.7.5 (from chromadb<0.6.0,>=0.4.0->langchain-chroma==0.1.2->-r requirements.txt (line 8))
Downloading chroma_hnswlib-0.7.5-cp312-cp312-macosx_10_9_x86_64.whl.metadata (252 bytes)
Collecting chromadb<0.6.0,>=0.4.0 (from langchain-chroma==0.1.2->-r requirements.txt (line 8))
Downloading chromadb-0.5.3-py3-none-any.whl.metadata (6.8 kB)
Collecting chroma-hnswlib==0.7.3 (from chromadb<0.6.0,>=0.4.0->langchain-chroma==0.1.2->-r requirements.txt (line 8))
Downloading chroma-hnswlib-0.7.3.tar.gz (31 kB)
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... done
Collecting chromadb<0.6.0,>=0.4.0 (from langchain-chroma==0.1.2->-r requirements.txt (line 8))
Downloading chromadb-0.5.2-py3-none-any.whl.metadata (6.8 kB)
Downloading chromadb-0.5.1-py3-none-any.whl.metadata (6.8 kB)
Downloading chromadb-0.5.0-py3-none-any.whl.metadata (7.3 kB)
Downloading chromadb-0.4.24-py3-none-any.whl.metadata (7.3 kB)
Collecting pulsar-client>=3.1.0 (from chromadb<0.6.0,>=0.4.0->langchain-chroma==0.1.2->-r requirements.txt (line 8))
Downloading pulsar_client-3.5.0-cp312-cp312-macosx_10_15_universal2.whl.metadata (1.0 kB)
Collecting chromadb<0.6.0,>=0.4.0 (from langchain-chroma==0.1.2->-r requirements.txt (line 8))
Using cached chromadb-0.4.23-py3-none-any.whl.metadata (7.3 kB)
INFO: pip is still looking at multiple versions of chromadb to determine which version is compatible with other requirements. This could take a while.
Downloading chromadb-0.4.22-py3-none-any.whl.metadata (7.3 kB)
Downloading chromadb-0.4.21-py3-none-any.whl.metadata (7.3 kB)
Downloading chromadb-0.4.20-py3-none-any.whl.metadata (7.3 kB)
Downloading chromadb-0.4.19-py3-none-any.whl.metadata (7.3 kB)
Downloading chromadb-0.4.18-py3-none-any.whl.metadata (7.4 kB)
INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. See https://pip.pypa.io/warnings/backtracking for guidance. If you want to abort this run, press Ctrl + C.
Downloading chromadb-0.4.17-py3-none-any.whl.metadata (7.3 kB)
Downloading chromadb-0.4.16-py3-none-any.whl.metadata (7.3 kB)
Downloading chromadb-0.4.15-py3-none-any.whl.metadata (7.2 kB)
Downloading chromadb-0.4.14-py3-none-any.whl.metadata (7.0 kB)
Downloading chromadb-0.4.13-py3-none-any.whl.metadata (7.0 kB)
Downloading chromadb-0.4.12-py3-none-any.whl.metadata (7.0 kB)
Collecting pydantic<3,>=1.9.0 (from anthropic<1,>=0.28.0->langchain-anthropic==0.1.19->-r requirements.txt (line 7))
Downloading pydantic-1.10.17-cp312-cp312-macosx_10_9_x86_64.whl.metadata (151 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 151.6/151.6 kB 1.6 MB/s eta 0:00:00
Collecting fastapi<1,>=0.95.2 (from langchain-chroma==0.1.2->-r requirements.txt (line 8))
Downloading fastapi-0.99.1-py3-none-any.whl.metadata (23 kB)
Collecting chromadb<0.6.0,>=0.4.0 (from langchain-chroma==0.1.2->-r requirements.txt (line 8))
Downloading chromadb-0.4.11-py3-none-any.whl.metadata (7.0 kB)
Downloading chromadb-0.4.10-py3-none-any.whl.metadata (7.0 kB)
Downloading chromadb-0.4.9-py3-none-any.whl.metadata (7.0 kB)
Collecting chroma-hnswlib==0.7.2 (from chromadb<0.6.0,>=0.4.0->langchain-chroma==0.1.2->-r requirements.txt (line 8))
Downloading chroma-hnswlib-0.7.2.tar.gz (31 kB)
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... done
Collecting chromadb<0.6.0,>=0.4.0 (from langchain-chroma==0.1.2->-r requirements.txt (line 8))
Downloading chromadb-0.4.8-py3-none-any.whl.metadata (6.9 kB)
Downloading chromadb-0.4.7-py3-none-any.whl.metadata (6.9 kB)
Downloading chromadb-0.4.6-py3-none-any.whl.metadata (6.8 kB)
Downloading chromadb-0.4.5-py3-none-any.whl.metadata (6.8 kB)
Downloading chromadb-0.4.4-py3-none-any.whl.metadata (6.8 kB)
Downloading chromadb-0.4.3-py3-none-any.whl.metadata (6.9 kB)
Collecting pandas>=1.3 (from chromadb<0.6.0,>=0.4.0->langchain-chroma==0.1.2->-r requirements.txt (line 8))
Using cached pandas-2.2.2-cp312-cp312-macosx_10_9_x86_64.whl.metadata (19 kB)
Collecting chroma-hnswlib==0.7.1 (from chromadb<0.6.0,>=0.4.0->langchain-chroma==0.1.2->-r requirements.txt (line 8))
Downloading chroma-hnswlib-0.7.1.tar.gz (30 kB)
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... done
Collecting chromadb<0.6.0,>=0.4.0 (from langchain-chroma==0.1.2->-r requirements.txt (line 8))
Downloading chromadb-0.4.2-py3-none-any.whl.metadata (6.9 kB)
Downloading chromadb-0.4.1-py3-none-any.whl.metadata (6.9 kB)
Downloading chromadb-0.4.0-py3-none-any.whl.metadata (6.9 kB)
ERROR: Cannot install langchain-chroma because these package versions have conflicting dependencies.
The conflict is caused by:
chromadb 0.5.5 depends on onnxruntime>=1.14.1
chromadb 0.5.4 depends on onnxruntime>=1.14.1
chromadb 0.5.3 depends on onnxruntime>=1.14.1
chromadb 0.5.2 depends on onnxruntime>=1.14.1
chromadb 0.5.1 depends on onnxruntime>=1.14.1
chromadb 0.5.0 depends on onnxruntime>=1.14.1
chromadb 0.4.24 depends on onnxruntime>=1.14.1
chromadb 0.4.23 depends on onnxruntime>=1.14.1
chromadb 0.4.22 depends on onnxruntime>=1.14.1
chromadb 0.4.21 depends on onnxruntime>=1.14.1
chromadb 0.4.20 depends on onnxruntime>=1.14.1
chromadb 0.4.19 depends on onnxruntime>=1.14.1
chromadb 0.4.18 depends on onnxruntime>=1.14.1
chromadb 0.4.17 depends on onnxruntime>=1.14.1
chromadb 0.4.16 depends on onnxruntime>=1.14.1
chromadb 0.4.15 depends on onnxruntime>=1.14.1
chromadb 0.4.14 depends on onnxruntime>=1.14.1
chromadb 0.4.13 depends on onnxruntime>=1.14.1
chromadb 0.4.12 depends on onnxruntime>=1.14.1
chromadb 0.4.11 depends on onnxruntime>=1.14.1
chromadb 0.4.10 depends on onnxruntime>=1.14.1
chromadb 0.4.9 depends on onnxruntime>=1.14.1
chromadb 0.4.8 depends on onnxruntime>=1.14.1
chromadb 0.4.7 depends on onnxruntime>=1.14.1
chromadb 0.4.6 depends on onnxruntime>=1.14.1
chromadb 0.4.5 depends on onnxruntime>=1.14.1
chromadb 0.4.4 depends on onnxruntime>=1.14.1
chromadb 0.4.3 depends on onnxruntime>=1.14.1
chromadb 0.4.2 depends on onnxruntime>=1.14.1
chromadb 0.4.1 depends on onnxruntime>=1.14.1
chromadb 0.4.0 depends on onnxruntime>=1.14.1
To fix this you could try to:
1. loosen the range of package versions you've specified
2. remove package versions to allow pip attempt to solve the dependency conflict
ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-dependency-conflicts
Try installing onnxruntime directly; you may be on a Linux distribution that the library doesn't support.
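One way to test that suggestion is to install onnxruntime up front and then retry the pinned requirements. This is just a sketch of the workaround, assuming the same conda environment; if onnxruntime itself fails to install, the platform really is unsupported and that is the root cause:

```shell
# Install onnxruntime first, so pip's resolver doesn't backtrack
# through every chromadb release looking for one that drops it.
pip install "onnxruntime>=1.14.1"

# If that succeeds, retry the original requirements.
pip install -r requirements.txt
```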
Running into a similar issue on what I believe to be a fresh installation of Ubuntu 22.04:
Building wheels for collected packages: pypika
Building wheel for pypika (pyproject.toml) ... done
Created wheel for pypika: filename=PyPika-0.48.9-py2.py3-none-any.whl size=53738 sha256=9bfff9dd4ef29a48c901e2370b6317ef53f4c2a8836e4c32c3fb739acadba59f
Stored in directory: /home/hvanmegen/.cache/pip/wheels/e1/26/51/d0bffb3d2fd82256676d7ad3003faea3bd6dddc9577af665f4
Successfully built pypika
Installing collected packages: pypika, mpmath, monotonic, mmh3, flatbuffers, wrapt, websockets, webcolors, uvloop, uritemplate, typing-extensions, tenacity, sympy, shellingham, safetensors, python-multipart, python-dotenv, pyreqwest-impersonate, pyproject_hooks, pycparser, Pillow, packaging, overrides, orjson, opentelemetry-util-http, opentelemetry-proto, oauthlib, nvidia-nvtx-cu12, nvidia-nvjitlink-cu12, nvidia-nccl-cu12, nvidia-curand-cu12, nvidia-cufft-cu12, nvidia-cuda-runtime-cu12, nvidia-cuda-nvrtc-cu12, nvidia-cuda-cupti-cu12, nvidia-cublas-cu12, networkx, jsonpointer, jiter, inputimeout, importlib-resources, importlib-metadata, humanfriendly, httptools, httpcore, grpcio, greenlet, fsspec, filelock, email_validator, defusedxml, click, chroma-hnswlib, bcrypt, backoff, ansio, annotated-types, watchfiles, uvicorn, typing-inspect, triton, tiktoken, starlette, SQLAlchemy, requests-oauthlib, pydantic-core, posthog, opentelemetry-exporter-otlp-proto-common, nvidia-cusparse-cu12, nvidia-cudnn-cu12, marshmallow, jsonpatch, huggingface-hub, httpx, duckduckgo_search, docker, deprecated, coloredlogs, cffi, build, asgiref, typer, tokenizers, pydantic, opentelemetry-api, onnxruntime, nvidia-cusolver-cu12, kubernetes, google-auth-httplib2, dataclasses-json, transformers, torch, paramiko, opentelemetry-semantic-conventions, opentelemetry-instrumentation, openai, langsmith, groq, google-api-python-client, fastapi-cli, anthropic, sentence-transformers, opentelemetry-sdk, opentelemetry-instrumentation-asgi, langchain-core, google-ai-generativelanguage, fastapi, opentelemetry-instrumentation-fastapi, opentelemetry-exporter-otlp-proto-grpc, langchain-text-splitters, langchain-openai, langchain-huggingface, langchain-groq, langchain-anthropic, google-generativeai, langchain-google-genai, langchain, chromadb, langchain-community, langchain-chroma
Attempting uninstall: typing-extensions
Found existing installation: typing_extensions 4.5.0
Uninstalling typing_extensions-4.5.0:
Successfully uninstalled typing_extensions-4.5.0
Attempting uninstall: packaging
Found existing installation: packaging 23.1
Uninstalling packaging-23.1:
Successfully uninstalled packaging-23.1
Attempting uninstall: httpcore
Found existing installation: httpcore 0.17.3
Uninstalling httpcore-0.17.3:
Successfully uninstalled httpcore-0.17.3
Attempting uninstall: grpcio
Found existing installation: grpcio 1.57.0
Uninstalling grpcio-1.57.0:
Successfully uninstalled grpcio-1.57.0
Attempting uninstall: tiktoken
Found existing installation: tiktoken 0.3.3
Uninstalling tiktoken-0.3.3:
Successfully uninstalled tiktoken-0.3.3
Attempting uninstall: httpx
Found existing installation: httpx 0.24.1
Uninstalling httpx-0.24.1:
Successfully uninstalled httpx-0.24.1
Attempting uninstall: tokenizers
Found existing installation: tokenizers 0.13.3
Uninstalling tokenizers-0.13.3:
Successfully uninstalled tokenizers-0.13.3
Attempting uninstall: openai
Found existing installation: openai 0.27.8
Uninstalling openai-0.27.8:
Successfully uninstalled openai-0.27.8
Attempting uninstall: anthropic
Found existing installation: anthropic 0.2.8
Uninstalling anthropic-0.2.8:
Successfully uninstalled anthropic-0.2.8
Attempting uninstall: google-ai-generativelanguage
Found existing installation: google-ai-generativelanguage 0.2.0
Uninstalling google-ai-generativelanguage-0.2.0:
Successfully uninstalled google-ai-generativelanguage-0.2.0
Attempting uninstall: google-generativeai
Found existing installation: google-generativeai 0.1.0
Uninstalling google-generativeai-0.1.0:
Successfully uninstalled google-generativeai-0.1.0
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
gpt-command-line 0.1.3 requires anthropic==0.2.8, but you have anthropic 0.32.0 which is incompatible.
gpt-command-line 0.1.3 requires google-generativeai==0.1.0, but you have google-generativeai 0.7.2 which is incompatible.
gpt-command-line 0.1.3 requires openai==0.27.8, but you have openai 1.37.1 which is incompatible.
gpt-command-line 0.1.3 requires tiktoken==0.3.3, but you have tiktoken 0.7.0 which is incompatible.
gpt-command-line 0.1.3 requires tokenizers==0.13.3, but you have tokenizers 0.19.1 which is incompatible.
gpt-command-line 0.1.3 requires typing-extensions==4.5.0, but you have typing-extensions 4.12.2 which is incompatible.
Successfully installed Pillow-10.4.0 SQLAlchemy-2.0.31 annotated-types-0.7.0 ansio-0.0.1 anthropic-0.32.0 asgiref-3.8.1 backoff-2.2.1 bcrypt-4.2.0 build-1.2.1 cffi-1.16.0 chroma-hnswlib-0.7.6 chromadb-0.5.5 click-8.1.7 coloredlogs-15.0.1 dataclasses-json-0.6.7 defusedxml-0.7.1 deprecated-1.2.14 docker-7.1.0 duckduckgo_search-6.1.12 email_validator-2.2.0 fastapi-0.111.1 fastapi-cli-0.0.4 filelock-3.15.4 flatbuffers-24.3.25 fsspec-2024.6.1 google-ai-generativelanguage-0.6.6 google-api-python-client-2.139.0 google-auth-httplib2-0.2.0 google-generativeai-0.7.2 greenlet-3.0.3 groq-0.9.0 grpcio-1.65.1 httpcore-1.0.5 httptools-0.6.1 httpx-0.27.0 huggingface-hub-0.24.3 humanfriendly-10.0 importlib-metadata-8.0.0 importlib-resources-6.4.0 inputimeout-1.0.4 jiter-0.5.0 jsonpatch-1.33 jsonpointer-3.0.0 kubernetes-30.1.0 langchain-0.2.11 langchain-anthropic-0.1.19 langchain-chroma-0.1.2 langchain-community-0.2.7 langchain-core-0.2.25 langchain-google-genai-1.0.7 langchain-groq-0.1.6 langchain-huggingface-0.0.3 langchain-openai-0.1.15 langchain-text-splitters-0.2.2 langsmith-0.1.94 marshmallow-3.21.3 mmh3-4.1.0 monotonic-1.6 mpmath-1.3.0 networkx-3.3 nvidia-cublas-cu12-12.1.3.1 nvidia-cuda-cupti-cu12-12.1.105 nvidia-cuda-nvrtc-cu12-12.1.105 nvidia-cuda-runtime-cu12-12.1.105 nvidia-cudnn-cu12-9.1.0.70 nvidia-cufft-cu12-11.0.2.54 nvidia-curand-cu12-10.3.2.106 nvidia-cusolver-cu12-11.4.5.107 nvidia-cusparse-cu12-12.1.0.106 nvidia-nccl-cu12-2.20.5 nvidia-nvjitlink-cu12-12.5.82 nvidia-nvtx-cu12-12.1.105 oauthlib-3.2.2 onnxruntime-1.18.1 openai-1.37.1 opentelemetry-api-1.26.0 opentelemetry-exporter-otlp-proto-common-1.26.0 opentelemetry-exporter-otlp-proto-grpc-1.26.0 opentelemetry-instrumentation-0.47b0 opentelemetry-instrumentation-asgi-0.47b0 opentelemetry-instrumentation-fastapi-0.47b0 opentelemetry-proto-1.26.0 opentelemetry-sdk-1.26.0 opentelemetry-semantic-conventions-0.47b0 opentelemetry-util-http-0.47b0 orjson-3.10.6 overrides-7.7.0 packaging-24.1 paramiko-3.4.0 
posthog-3.5.0 pycparser-2.22 pydantic-2.8.2 pydantic-core-2.20.1 pypika-0.48.9 pyproject_hooks-1.1.0 pyreqwest-impersonate-0.5.3 python-dotenv-1.0.1 python-multipart-0.0.9 requests-oauthlib-2.0.0 safetensors-0.4.3 sentence-transformers-3.0.1 shellingham-1.5.4 starlette-0.37.2 sympy-1.13.1 tenacity-8.5.0 tiktoken-0.7.0 tokenizers-0.19.1 torch-2.4.0 transformers-4.43.3 triton-3.0.0 typer-0.12.3 typing-extensions-4.12.2 typing-inspect-0.9.0 uritemplate-4.1.1 uvicorn-0.30.3 uvloop-0.19.0 watchfiles-0.22.0 webcolors-24.6.0 websockets-12.0 wrapt-1.16.0
.. which results in this error when starting main.py:
$> python3 main.py
Traceback (most recent call last):
File "/home/hvanmegen/repos/agent-zero/main.py", line 1, in <module>
import threading, time, models, os
File "/home/hvanmegen/repos/agent-zero/models.py", line 8, in <module>
from langchain_google_genai import ChatGoogleGenerativeAI, HarmBlockThreshold, HarmCategory
File "/home/hvanmegen/.local/lib/python3.10/site-packages/langchain_google_genai/__init__.py", line 58, in <module>
from langchain_google_genai._enums import HarmBlockThreshold, HarmCategory
File "/home/hvanmegen/.local/lib/python3.10/site-packages/langchain_google_genai/_enums.py", line 1, in <module>
import google.ai.generativelanguage_v1beta as genai
ModuleNotFoundError: No module named 'google.ai.generativelanguage_v1beta'
.. re-running it doesn't show me much:
$> pip install -r requirements.txt | grep -v "Requirement already satisfied"
Defaulting to user installation because normal site-packages is not writeable
Try installing onnxruntime directly; you may be on a Linux distribution that the library doesn't support.
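The "Defaulting to user installation because normal site-packages is not writeable" line above means pip is installing into `~/.local`, where the old pins from `gpt-command-line` (anthropic 0.2.8, google-generativeai 0.1.0, etc.) are mixed with agent-zero's new ones; that kind of mixing is exactly what produces `ModuleNotFoundError: No module named 'google.ai.generativelanguage_v1beta'`. A sketch of how to isolate agent-zero in its own virtual environment instead (the `--force-reinstall` step is a suggested extra, not something from the original report):

```shell
# From the agent-zero checkout: create an isolated environment so
# agent-zero's pins don't fight with packages already in ~/.local.
python3 -m venv .venv
source .venv/bin/activate

# Inside the venv, pip installs into .venv/, not the user site.
pip install -r requirements.txt

# If the google.ai.generativelanguage_v1beta import still fails,
# force a clean reinstall of the Google packages, since stale
# partial installs of them cause exactly this import error.
pip install --force-reinstall google-ai-generativelanguage google-generativeai
```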
I'm on a MacBook Pro. I'll try your suggestion, thanks.
I get this error when installing the requirements. Could someone help with this issue? I have installed C++ but it doesn't work:
Building wheels for collected packages: chroma-hnswlib
  Building wheel for chroma-hnswlib (pyproject.toml) ... error
  error: subprocess-exited-with-error

  × Building wheel for chroma-hnswlib (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [5 lines of output]
      running bdist_wheel
      running build
      running build_ext
      building 'hnswlib' extension
      error: Microsoft Visual C++ 14.0 or greater is required. Get it with "Microsoft C++ Build Tools": https://visualstudio.microsoft.com/visual-cpp-build-tools/
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for chroma-hnswlib
Failed to build chroma-hnswlib
ERROR: ERROR: Failed to build installable wheels for some pyproject.toml based projects (chroma-hnswlib)
Chroma is no longer used; it is not present in requirements.txt.
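If Chroma has indeed been dropped from requirements.txt upstream, updating the checkout before reinstalling should avoid the chroma-hnswlib build entirely. A minimal sketch, assuming the repo was cloned with git and tracks the `main` branch:

```shell
# Pull the latest requirements.txt (with the Chroma pin removed)
# and reinstall; the chroma-hnswlib wheel is no longer built.
git pull origin main
pip install -r requirements.txt
```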
You need to install a C++ compiler, because chroma-hnswlib builds a native extension from source. Install "Microsoft C++ Build Tools"; during installation, select the "C++ build tools" workload and make sure the Windows 10 SDK is included. ChatGPT can also walk you through the steps. I did exactly that and now I'm good.
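After installing the Build Tools, a quick way to confirm the MSVC compiler (`cl.exe`) is visible before retrying is, from a "Developer Command Prompt for VS" (a suggested check, not from the original thread):

```shell
# Check that the MSVC compiler is on PATH (Windows cmd syntax).
where cl

# Retry just the failing wheel first, then the full requirements.
pip install chroma-hnswlib
pip install -r requirements.txt
```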