
[BUG] Pydantic error with CrewAi + langchain_ollama

Open widarr opened this issue 1 year ago • 5 comments

Description

I defined my llms as following:

```python
from crewai import Agent, Crew, Process, Task
from crewai.project import CrewBase, agent, crew, task
from langchain_ollama import ChatOllama
import os

os.environ["OPENAI_API_KEY"] = "NA"


class PersonalityCrew():
    """Personality crew"""
    agents_config = 'config/agents.yaml'
    tasks_config = 'config/tasks.yaml'

    chat_llm = ChatOllama(
        model="rolandroland/llama3.1-uncensored:latest",
        base_url="http://localhost:11434"
    )

    instruct_llm = ChatOllama(
        model="mistral:instruct",
        base_url="http://localhost:11434"
    )

    ...
```

I have defined an agent like this:

    @agent
    def researcher(self) -> Agent:
        return Agent(
            config=self.agents_config['researcher'],
            # tools=[MyCustomTool()], # Example of custom tool, loaded at the beginning of the file
            verbose=True,
            llm=self.chat_llm,
            function_calling_llm=self.instruct_llm
        )

I get the following error in the console:

Failed to convert text into a pydantic model due to the following error: 1 validation error for CrewPydanticOutputParser pydantic_object subclass of BaseModel expected (type=type_error.subclass; expected_class=BaseModel)

Steps to Reproduce

I just followed the Getting Started section on crewai.com and the tutorial on using local LLMs.

Expected behavior

No error message, working agents/crew

Screenshots/Code snippets

Error message as described above

Operating System

Other (specify in additional context)

Python Version

3.12

crewAI Version

0.51.1

crewAI Tools Version

0.8.3

Virtual Environment

Venv

Evidence

```
> Entering new CrewAgentExecutor chain...
Action: Delegate work to coworker
Action Input: {
  'task': "Conduct research about AI LLMs",
  'context': "AI LLMs are a type of artificial intelligence that uses deep learning techniques to generate human-like language. They have gained popularity in recent years due to their ability to understand and respond to natural language inputs.",
  'coworker': "AI LLMs Reporting Analyst"

Failed to convert text into a pydantic model due to the following error: 1 validation error for CrewPydanticOutputParser
pydantic_object
  subclass of BaseModel expected (type=type_error.subclass; expected_class=BaseModel)
```

Possible Solution

Check whether the pydantic version pulled in by langchain is compatible with CrewAI?
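A quick way to compare what actually got installed (a minimal sketch, not part of the original report; the distribution names are assumed to match the PyPI package names):

```python
# Hypothetical version check: print what pip resolved for the packages involved,
# so the pydantic pulled in by langchain can be compared against what CrewAI expects.
from importlib.metadata import version, PackageNotFoundError

for pkg in ("crewai", "langchain-ollama", "langchain-core", "pydantic"):
    try:
        print(f"{pkg}: {version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed")
```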

Additional context

OS: Manjaro Linux

widarr avatar Aug 23 '24 15:08 widarr

Do you have a Pydantic model defined anywhere? Please post all your code.

theCyberTech avatar Aug 23 '24 15:08 theCyberTech

> Do you have a Pydantic model defined anywhere? Please post all your code.

As I stated in my bug report, I did not really deviate from the Getting Started tutorial, except for adding the langchain-ollama integration as described here: https://docs.crewai.com/how-to/LLM-Connections/#ollama-integration-step-by-step-ex-for-using-llama-31-8b-locally

I created a new project with `crewai create crew PROJECTNAME` and changed the crew.py file to use Ollama. Here is my crew.py file: crew.py.zip

widarr avatar Aug 23 '24 22:08 widarr

Any update here? I have the same error, but using the function_calling_llm parameter with Azure OpenAI:

Action Input: { "wiql": "SELECT [System.Id], [System.Title], [System.Description], [System.State] FROM WorkItems WHERE [System.TeamProject] = 'Kairo LC'", "project_name": "Kairo LC" }

Failed to convert text into a pydantic model due to the following error: 1 validation error for CrewPydanticOutputParser pydantic_object subclass of BaseModel expected (type=type_error.subclass; expected_class=BaseModel)
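For reference, a minimal sketch of that combination (assumed deployment name and API version; the endpoint and key are expected via the standard AZURE_OPENAI_ENDPOINT and AZURE_OPENAI_API_KEY environment variables, and the agent fields are illustrative):

```python
# Sketch only: an Agent wired to Azure OpenAI via langchain_openai, using the same
# function_calling_llm parameter mentioned in the error report above.
from crewai import Agent
from langchain_openai import AzureChatOpenAI

azure_llm = AzureChatOpenAI(
    azure_deployment="gpt-4o",   # assumed deployment name
    api_version="2024-02-01",    # assumed API version
)

analyst = Agent(
    role="Work item analyst",
    goal="Query and summarize Azure DevOps work items",
    backstory="Illustrative agent showing the llm/function_calling_llm combination.",
    llm=azure_llm,
    function_calling_llm=azure_llm,
    verbose=True,
)
```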

Carlososuna11 avatar Sep 09 '24 19:09 Carlososuna11

Any update on a fix? In my case, Pydantic errors occur when adding memory, e.g.:

    memory=True,
    embedder={
        "provider": "huggingface",
        "config": {
            "model": 'mixedbread-ai/mxbai-embed-large-v1',
        }
    }

I installed it with `pip install langchain_huggingface` as instructed, and pydantic blows up:

The required dependencies for HuggingFaceHub are not installed.Please install with pip install langchain_huggingface root@74f3bf8ff518:/app/crewAI# pip install langchain_huggingface Collecting langchain_huggingface Obtaining dependency information for langchain_huggingface from https://files.pythonhosted.org/packages/4f/9b/86f1bb19739ca510503abb4dca65deff9c95dc318bde2e51ce8c4fa7847c/langchain_huggingface-0.1.0-py3-none-any.whl.metadata Downloading langchain_huggingface-0.1.0-py3-none-any.whl.metadata (1.3 kB) Requirement already satisfied: huggingface-hub>=0.23.0 in /usr/local/lib/python3.11/site-packages (from langchain_huggingface) (0.25.0) Collecting langchain-core<0.4,>=0.3.0 (from langchain_huggingface) Obtaining dependency information for langchain-core<0.4,>=0.3.0 from https://files.pythonhosted.org/packages/ab/04/608e974a8ae6f125629bdbe8c4ba02364fb8ff989e819e237c558d9c109a/langchain_core-0.3.1-py3-none-any.whl.metadata Downloading langchain_core-0.3.1-py3-none-any.whl.metadata (6.2 kB) Collecting sentence-transformers>=2.6.0 (from langchain_huggingface) Obtaining dependency information for sentence-transformers>=2.6.0 from https://files.pythonhosted.org/packages/90/b4/52b8205f24172f2429cacf04bac324414f16b61d64e79c787c9ce2385586/sentence_transformers-3.1.0-py3-none-any.whl.metadata Downloading sentence_transformers-3.1.0-py3-none-any.whl.metadata (23 kB) Requirement already satisfied: tokenizers>=0.19.1 in /usr/local/lib/python3.11/site-packages (from langchain_huggingface) (0.20.0) Collecting transformers>=4.39.0 (from langchain_huggingface) Obtaining dependency information for transformers>=4.39.0 from https://files.pythonhosted.org/packages/75/35/07c9879163b603f0e464b0f6e6e628a2340cfc7cdc5ca8e7d52d776710d4/transformers-4.44.2-py3-none-any.whl.metadata Downloading transformers-4.44.2-py3-none-any.whl.metadata (43 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 43.7/43.7 kB 2.5 MB/s eta 0:00:00 Requirement already satisfied: filelock in /usr/local/lib/python3.11/site-packages (from huggingface-hub>=0.23.0->langchain_huggingface) (3.16.0) Requirement already satisfied: fsspec>=2023.5.0 in /usr/local/lib/python3.11/site-packages (from huggingface-hub>=0.23.0->langchain_huggingface) (2024.9.0) Requirement already satisfied: packaging>=20.9 in /usr/local/lib/python3.11/site-packages (from huggingface-hub>=0.23.0->langchain_huggingface) (24.1) Requirement already satisfied: pyyaml>=5.1 in /usr/local/lib/python3.11/site-packages (from huggingface-hub>=0.23.0->langchain_huggingface) (6.0.2) Requirement already satisfied: requests in /usr/local/lib/python3.11/site-packages (from huggingface-hub>=0.23.0->langchain_huggingface) (2.32.3) Requirement already satisfied: tqdm>=4.42.1 in /usr/local/lib/python3.11/site-packages (from huggingface-hub>=0.23.0->langchain_huggingface) (4.66.5) Requirement already satisfied: typing-extensions>=3.7.4.3 in /usr/local/lib/python3.11/site-packages (from huggingface-hub>=0.23.0->langchain_huggingface) (4.12.2) Requirement already satisfied: jsonpatch<2.0,>=1.33 in /usr/local/lib/python3.11/site-packages (from langchain-core<0.4,>=0.3.0->langchain_huggingface) (1.33) Requirement already satisfied: langsmith<0.2.0,>=0.1.117 in /usr/local/lib/python3.11/site-packages (from langchain-core<0.4,>=0.3.0->langchain_huggingface) (0.1.121) Requirement already satisfied: pydantic<3.0.0,>=2.5.2 in /usr/local/lib/python3.11/site-packages (from langchain-core<0.4,>=0.3.0->langchain_huggingface) (2.9.2) Requirement already satisfied: tenacity!=8.4.0,<9.0.0,>=8.1.0 in 
/usr/local/lib/python3.11/site-packages (from langchain-core<0.4,>=0.3.0->langchain_huggingface) (8.5.0) Collecting torch>=1.11.0 (from sentence-transformers>=2.6.0->langchain_huggingface) Obtaining dependency information for torch>=1.11.0 from https://files.pythonhosted.org/packages/ea/ea/4ab009e953bca6ff35ad75b8ab58c0923308636c182c145dc63084f7d136/torch-2.4.1-cp311-cp311-manylinux1_x86_64.whl.metadata Downloading torch-2.4.1-cp311-cp311-manylinux1_x86_64.whl.metadata (26 kB) Requirement already satisfied: numpy<2.0.0 in /usr/local/lib/python3.11/site-packages (from sentence-transformers>=2.6.0->langchain_huggingface) (1.26.4) Collecting scikit-learn (from sentence-transformers>=2.6.0->langchain_huggingface) Obtaining dependency information for scikit-learn from https://files.pythonhosted.org/packages/49/21/3723de321531c9745e40f1badafd821e029d346155b6c79704e0b7197552/scikit_learn-1.5.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata Downloading scikit_learn-1.5.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (13 kB) Collecting scipy (from sentence-transformers>=2.6.0->langchain_huggingface) Obtaining dependency information for scipy from https://files.pythonhosted.org/packages/93/6b/701776d4bd6bdd9b629c387b5140f006185bd8ddea16788a44434376b98f/scipy-1.14.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata Downloading scipy-1.14.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (60 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 60.8/60.8 kB 5.9 MB/s eta 0:00:00 Requirement already satisfied: Pillow in /usr/local/lib/python3.11/site-packages (from sentence-transformers>=2.6.0->langchain_huggingface) (10.4.0) Requirement already satisfied: regex!=2019.12.17 in /usr/local/lib/python3.11/site-packages (from transformers>=4.39.0->langchain_huggingface) (2024.9.11) Collecting safetensors>=0.4.1 (from transformers>=4.39.0->langchain_huggingface) Obtaining dependency information for safetensors>=0.4.1 from https://files.pythonhosted.org/packages/e6/ee/69e498a892f208bd1da4104d4b9be887f8611bf4942144718b6738482250/safetensors-0.4.5-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata Downloading safetensors-0.4.5-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (3.8 kB) Collecting tokenizers>=0.19.1 (from langchain_huggingface) Obtaining dependency information for tokenizers>=0.19.1 from https://files.pythonhosted.org/packages/a7/03/fb50fc03f86016b227a967c8d474f90230c885c0d18f78acdfda7a96ce56/tokenizers-0.19.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata Downloading tokenizers-0.19.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (6.7 kB) Requirement already satisfied: jsonpointer>=1.9 in /usr/local/lib/python3.11/site-packages (from jsonpatch<2.0,>=1.33->langchain-core<0.4,>=0.3.0->langchain_huggingface) (3.0.0) Requirement already satisfied: httpx<1,>=0.23.0 in /usr/local/lib/python3.11/site-packages (from langsmith<0.2.0,>=0.1.117->langchain-core<0.4,>=0.3.0->langchain_huggingface) (0.27.2) Requirement already satisfied: orjson<4.0.0,>=3.9.14 in /usr/local/lib/python3.11/site-packages (from langsmith<0.2.0,>=0.1.117->langchain-core<0.4,>=0.3.0->langchain_huggingface) (3.10.7) Requirement already satisfied: annotated-types>=0.6.0 in /usr/local/lib/python3.11/site-packages (from pydantic<3.0.0,>=2.5.2->langchain-core<0.4,>=0.3.0->langchain_huggingface) (0.7.0) Requirement already satisfied: pydantic-core==2.23.4 in 
/usr/local/lib/python3.11/site-packages (from pydantic<3.0.0,>=2.5.2->langchain-core<0.4,>=0.3.0->langchain_huggingface) (2.23.4) Requirement already satisfied: charset-normalizer<4,>=2 in /usr/local/lib/python3.11/site-packages (from requests->huggingface-hub>=0.23.0->langchain_huggingface) (3.3.2) Requirement already satisfied: idna<4,>=2.5 in /usr/local/lib/python3.11/site-packages (from requests->huggingface-hub>=0.23.0->langchain_huggingface) (3.10) Requirement already satisfied: urllib3<3,>=1.21.1 in /usr/local/lib/python3.11/site-packages (from requests->huggingface-hub>=0.23.0->langchain_huggingface) (2.2.3) Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.11/site-packages (from requests->huggingface-hub>=0.23.0->langchain_huggingface) (2024.8.30) Requirement already satisfied: sympy in /usr/local/lib/python3.11/site-packages (from torch>=1.11.0->sentence-transformers>=2.6.0->langchain_huggingface) (1.13.2) Collecting networkx (from torch>=1.11.0->sentence-transformers>=2.6.0->langchain_huggingface) Obtaining dependency information for networkx from https://files.pythonhosted.org/packages/38/e9/5f72929373e1a0e8d142a130f3f97e6ff920070f87f91c4e13e40e0fba5a/networkx-3.3-py3-none-any.whl.metadata Downloading networkx-3.3-py3-none-any.whl.metadata (5.1 kB) Requirement already satisfied: jinja2 in /usr/local/lib/python3.11/site-packages (from torch>=1.11.0->sentence-transformers>=2.6.0->langchain_huggingface) (3.1.4) Collecting nvidia-cuda-nvrtc-cu12==12.1.105 (from torch>=1.11.0->sentence-transformers>=2.6.0->langchain_huggingface) Obtaining dependency information for nvidia-cuda-nvrtc-cu12==12.1.105 from https://files.pythonhosted.org/packages/b6/9f/c64c03f49d6fbc56196664d05dba14e3a561038a81a638eeb47f4d4cfd48/nvidia_cuda_nvrtc_cu12-12.1.105-py3-none-manylinux1_x86_64.whl.metadata Downloading nvidia_cuda_nvrtc_cu12-12.1.105-py3-none-manylinux1_x86_64.whl.metadata (1.5 kB) Collecting nvidia-cuda-runtime-cu12==12.1.105 (from torch>=1.11.0->sentence-transformers>=2.6.0->langchain_huggingface) Obtaining dependency information for nvidia-cuda-runtime-cu12==12.1.105 from https://files.pythonhosted.org/packages/eb/d5/c68b1d2cdfcc59e72e8a5949a37ddb22ae6cade80cd4a57a84d4c8b55472/nvidia_cuda_runtime_cu12-12.1.105-py3-none-manylinux1_x86_64.whl.metadata Downloading nvidia_cuda_runtime_cu12-12.1.105-py3-none-manylinux1_x86_64.whl.metadata (1.5 kB) Collecting nvidia-cuda-cupti-cu12==12.1.105 (from torch>=1.11.0->sentence-transformers>=2.6.0->langchain_huggingface) Obtaining dependency information for nvidia-cuda-cupti-cu12==12.1.105 from https://files.pythonhosted.org/packages/7e/00/6b218edd739ecfc60524e585ba8e6b00554dd908de2c9c66c1af3e44e18d/nvidia_cuda_cupti_cu12-12.1.105-py3-none-manylinux1_x86_64.whl.metadata Downloading nvidia_cuda_cupti_cu12-12.1.105-py3-none-manylinux1_x86_64.whl.metadata (1.6 kB) Collecting nvidia-cudnn-cu12==9.1.0.70 (from torch>=1.11.0->sentence-transformers>=2.6.0->langchain_huggingface) Obtaining dependency information for nvidia-cudnn-cu12==9.1.0.70 from https://files.pythonhosted.org/packages/9f/fd/713452cd72343f682b1c7b9321e23829f00b842ceaedcda96e742ea0b0b3/nvidia_cudnn_cu12-9.1.0.70-py3-none-manylinux2014_x86_64.whl.metadata Downloading nvidia_cudnn_cu12-9.1.0.70-py3-none-manylinux2014_x86_64.whl.metadata (1.6 kB) Collecting nvidia-cublas-cu12==12.1.3.1 (from torch>=1.11.0->sentence-transformers>=2.6.0->langchain_huggingface) Obtaining dependency information for nvidia-cublas-cu12==12.1.3.1 from 
https://files.pythonhosted.org/packages/37/6d/121efd7382d5b0284239f4ab1fc1590d86d34ed4a4a2fdb13b30ca8e5740/nvidia_cublas_cu12-12.1.3.1-py3-none-manylinux1_x86_64.whl.metadata Downloading nvidia_cublas_cu12-12.1.3.1-py3-none-manylinux1_x86_64.whl.metadata (1.5 kB) Collecting nvidia-cufft-cu12==11.0.2.54 (from torch>=1.11.0->sentence-transformers>=2.6.0->langchain_huggingface) Obtaining dependency information for nvidia-cufft-cu12==11.0.2.54 from https://files.pythonhosted.org/packages/86/94/eb540db023ce1d162e7bea9f8f5aa781d57c65aed513c33ee9a5123ead4d/nvidia_cufft_cu12-11.0.2.54-py3-none-manylinux1_x86_64.whl.metadata Downloading nvidia_cufft_cu12-11.0.2.54-py3-none-manylinux1_x86_64.whl.metadata (1.5 kB) Collecting nvidia-curand-cu12==10.3.2.106 (from torch>=1.11.0->sentence-transformers>=2.6.0->langchain_huggingface) Obtaining dependency information for nvidia-curand-cu12==10.3.2.106 from https://files.pythonhosted.org/packages/44/31/4890b1c9abc496303412947fc7dcea3d14861720642b49e8ceed89636705/nvidia_curand_cu12-10.3.2.106-py3-none-manylinux1_x86_64.whl.metadata Downloading nvidia_curand_cu12-10.3.2.106-py3-none-manylinux1_x86_64.whl.metadata (1.5 kB) Collecting nvidia-cusolver-cu12==11.4.5.107 (from torch>=1.11.0->sentence-transformers>=2.6.0->langchain_huggingface) Obtaining dependency information for nvidia-cusolver-cu12==11.4.5.107 from https://files.pythonhosted.org/packages/bc/1d/8de1e5c67099015c834315e333911273a8c6aaba78923dd1d1e25fc5f217/nvidia_cusolver_cu12-11.4.5.107-py3-none-manylinux1_x86_64.whl.metadata Downloading nvidia_cusolver_cu12-11.4.5.107-py3-none-manylinux1_x86_64.whl.metadata (1.6 kB) Collecting nvidia-cusparse-cu12==12.1.0.106 (from torch>=1.11.0->sentence-transformers>=2.6.0->langchain_huggingface) Obtaining dependency information for nvidia-cusparse-cu12==12.1.0.106 from https://files.pythonhosted.org/packages/65/5b/cfaeebf25cd9fdec14338ccb16f6b2c4c7fa9163aefcf057d86b9cc248bb/nvidia_cusparse_cu12-12.1.0.106-py3-none-manylinux1_x86_64.whl.metadata Downloading nvidia_cusparse_cu12-12.1.0.106-py3-none-manylinux1_x86_64.whl.metadata (1.6 kB) Collecting nvidia-nccl-cu12==2.20.5 (from torch>=1.11.0->sentence-transformers>=2.6.0->langchain_huggingface) Obtaining dependency information for nvidia-nccl-cu12==2.20.5 from https://files.pythonhosted.org/packages/4b/2a/0a131f572aa09f741c30ccd45a8e56316e8be8dfc7bc19bf0ab7cfef7b19/nvidia_nccl_cu12-2.20.5-py3-none-manylinux2014_x86_64.whl.metadata Downloading nvidia_nccl_cu12-2.20.5-py3-none-manylinux2014_x86_64.whl.metadata (1.8 kB) Collecting nvidia-nvtx-cu12==12.1.105 (from torch>=1.11.0->sentence-transformers>=2.6.0->langchain_huggingface) Obtaining dependency information for nvidia-nvtx-cu12==12.1.105 from https://files.pythonhosted.org/packages/da/d3/8057f0587683ed2fcd4dbfbdfdfa807b9160b809976099d36b8f60d08f03/nvidia_nvtx_cu12-12.1.105-py3-none-manylinux1_x86_64.whl.metadata Downloading nvidia_nvtx_cu12-12.1.105-py3-none-manylinux1_x86_64.whl.metadata (1.7 kB) Collecting triton==3.0.0 (from torch>=1.11.0->sentence-transformers>=2.6.0->langchain_huggingface) Obtaining dependency information for triton==3.0.0 from https://files.pythonhosted.org/packages/33/3e/a2f59384587eff6aeb7d37b6780de7fedd2214935e27520430ca9f5b7975/triton-3.0.0-1-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl.metadata Downloading triton-3.0.0-1-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl.metadata (1.3 kB) Collecting nvidia-nvjitlink-cu12 (from 
nvidia-cusolver-cu12==11.4.5.107->torch>=1.11.0->sentence-transformers>=2.6.0->langchain_huggingface) Obtaining dependency information for nvidia-nvjitlink-cu12 from https://files.pythonhosted.org/packages/a8/48/a9775d377cb95585fb188b469387f58ba6738e268de22eae2ad4cedb2c41/nvidia_nvjitlink_cu12-12.6.68-py3-none-manylinux2014_x86_64.whl.metadata Downloading nvidia_nvjitlink_cu12-12.6.68-py3-none-manylinux2014_x86_64.whl.metadata (1.5 kB) Collecting joblib>=1.2.0 (from scikit-learn->sentence-transformers>=2.6.0->langchain_huggingface) Obtaining dependency information for joblib>=1.2.0 from https://files.pythonhosted.org/packages/91/29/df4b9b42f2be0b623cbd5e2140cafcaa2bef0759a00b7b70104dcfe2fb51/joblib-1.4.2-py3-none-any.whl.metadata Downloading joblib-1.4.2-py3-none-any.whl.metadata (5.4 kB) Collecting threadpoolctl>=3.1.0 (from scikit-learn->sentence-transformers>=2.6.0->langchain_huggingface) Obtaining dependency information for threadpoolctl>=3.1.0 from https://files.pythonhosted.org/packages/4b/2c/ffbf7a134b9ab11a67b0cf0726453cedd9c5043a4fe7a35d1cefa9a1bcfb/threadpoolctl-3.5.0-py3-none-any.whl.metadata Downloading threadpoolctl-3.5.0-py3-none-any.whl.metadata (13 kB) Requirement already satisfied: anyio in /usr/local/lib/python3.11/site-packages (from httpx<1,>=0.23.0->langsmith<0.2.0,>=0.1.117->langchain-core<0.4,>=0.3.0->langchain_huggingface) (4.4.0) Requirement already satisfied: httpcore==1.* in /usr/local/lib/python3.11/site-packages (from httpx<1,>=0.23.0->langsmith<0.2.0,>=0.1.117->langchain-core<0.4,>=0.3.0->langchain_huggingface) (1.0.5) Requirement already satisfied: sniffio in /usr/local/lib/python3.11/site-packages (from httpx<1,>=0.23.0->langsmith<0.2.0,>=0.1.117->langchain-core<0.4,>=0.3.0->langchain_huggingface) (1.3.1) Requirement already satisfied: h11<0.15,>=0.13 in /usr/local/lib/python3.11/site-packages (from httpcore==1.*->httpx<1,>=0.23.0->langsmith<0.2.0,>=0.1.117->langchain-core<0.4,>=0.3.0->langchain_huggingface) (0.14.0) Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.11/site-packages (from jinja2->torch>=1.11.0->sentence-transformers>=2.6.0->langchain_huggingface) (2.1.5) Requirement already satisfied: mpmath<1.4,>=1.1.0 in /usr/local/lib/python3.11/site-packages (from sympy->torch>=1.11.0->sentence-transformers>=2.6.0->langchain_huggingface) (1.3.0) Downloading langchain_huggingface-0.1.0-py3-none-any.whl (20 kB) Downloading langchain_core-0.3.1-py3-none-any.whl (405 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 405.1/405.1 kB 5.5 MB/s eta 0:00:00 Downloading sentence_transformers-3.1.0-py3-none-any.whl (249 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 249.1/249.1 kB 4.2 MB/s eta 0:00:00 Downloading transformers-4.44.2-py3-none-any.whl (9.5 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 9.5/9.5 MB 36.6 MB/s eta 0:00:00 Downloading tokenizers-0.19.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.6 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 3.6/3.6 MB 43.5 MB/s eta 0:00:00 Downloading safetensors-0.4.5-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (435 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 435.0/435.0 kB 50.6 MB/s eta 0:00:00 Downloading torch-2.4.1-cp311-cp311-manylinux1_x86_64.whl (797.1 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 797.1/797.1 MB 7.5 MB/s eta 0:00:00 Downloading nvidia_cublas_cu12-12.1.3.1-py3-none-manylinux1_x86_64.whl (410.6 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 410.6/410.6 MB 12.1 MB/s eta 0:00:00 Downloading nvidia_cuda_cupti_cu12-12.1.105-py3-none-manylinux1_x86_64.whl 
(14.1 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 14.1/14.1 MB 40.3 MB/s eta 0:00:00 Downloading nvidia_cuda_nvrtc_cu12-12.1.105-py3-none-manylinux1_x86_64.whl (23.7 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 23.7/23.7 MB 39.0 MB/s eta 0:00:00 Downloading nvidia_cuda_runtime_cu12-12.1.105-py3-none-manylinux1_x86_64.whl (823 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 823.6/823.6 kB 48.6 MB/s eta 0:00:00 Downloading nvidia_cudnn_cu12-9.1.0.70-py3-none-manylinux2014_x86_64.whl (664.8 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 664.8/664.8 MB 8.6 MB/s eta 0:00:00 Downloading nvidia_cufft_cu12-11.0.2.54-py3-none-manylinux1_x86_64.whl (121.6 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 121.6/121.6 MB 24.6 MB/s eta 0:00:00 Downloading nvidia_curand_cu12-10.3.2.106-py3-none-manylinux1_x86_64.whl (56.5 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 56.5/56.5 MB 32.9 MB/s eta 0:00:00 Downloading nvidia_cusolver_cu12-11.4.5.107-py3-none-manylinux1_x86_64.whl (124.2 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 124.2/124.2 MB 25.1 MB/s eta 0:00:00 Downloading nvidia_cusparse_cu12-12.1.0.106-py3-none-manylinux1_x86_64.whl (196.0 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 196.0/196.0 MB 19.3 MB/s eta 0:00:00 Downloading nvidia_nccl_cu12-2.20.5-py3-none-manylinux2014_x86_64.whl (176.2 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 176.2/176.2 MB 21.2 MB/s eta 0:00:00 Downloading nvidia_nvtx_cu12-12.1.105-py3-none-manylinux1_x86_64.whl (99 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 99.1/99.1 kB 21.7 MB/s eta 0:00:00 Downloading triton-3.0.0-1-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl (209.4 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 209.4/209.4 MB 19.2 MB/s eta 0:00:00 Downloading scikit_learn-1.5.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (13.3 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 13.3/13.3 MB 40.6 MB/s eta 0:00:00 Downloading scipy-1.14.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (41.2 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 41.2/41.2 MB 34.7 MB/s eta 0:00:00 Downloading joblib-1.4.2-py3-none-any.whl (301 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 301.8/301.8 kB 59.4 MB/s eta 0:00:00 Downloading threadpoolctl-3.5.0-py3-none-any.whl (18 kB) Downloading networkx-3.3-py3-none-any.whl (1.7 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.7/1.7 MB 44.3 MB/s eta 0:00:00 Downloading nvidia_nvjitlink_cu12-12.6.68-py3-none-manylinux2014_x86_64.whl (19.7 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 19.7/19.7 MB 38.2 MB/s eta 0:00:00 Installing collected packages: triton, threadpoolctl, scipy, safetensors, nvidia-nvtx-cu12, nvidia-nvjitlink-cu12, nvidia-nccl-cu12, nvidia-curand-cu12, nvidia-cufft-cu12, nvidia-cuda-runtime-cu12, nvidia-cuda-nvrtc-cu12, nvidia-cuda-cupti-cu12, nvidia-cublas-cu12, networkx, joblib, scikit-learn, nvidia-cusparse-cu12, nvidia-cudnn-cu12, tokenizers, nvidia-cusolver-cu12, transformers, torch, langchain-core, sentence-transformers, langchain_huggingface Attempting uninstall: tokenizers Found existing installation: tokenizers 0.20.0 Uninstalling tokenizers-0.20.0: Successfully uninstalled tokenizers-0.20.0 Attempting uninstall: langchain-core Found existing installation: langchain-core 0.2.40 Uninstalling langchain-core-0.2.40: Successfully uninstalled langchain-core-0.2.40 ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. 
langchain-experimental 0.0.65 requires langchain-core<0.3.0,>=0.2.38, but you have langchain-core 0.3.1 which is incompatible. langchain-openai 0.1.25 requires langchain-core<0.3.0,>=0.2.40, but you have langchain-core 0.3.1 which is incompatible. langchain-text-splitters 0.2.4 requires langchinstalling ain-core<0.3.0,>=0.2.38, but you have langchain-core 0.3.1 which is incompatible. langchain-cohere 0.1.9 requires langchain-core<0.3,>=0.2.2, but you have langchain-core 0.3.1 which is incompatible. langchain-community 0.2.17 requires langchain-core<0.3.0,>=0.2.39, but you have langchain-core 0.3.1 which is incompatible. langchain 0.2.16 requires langchain-core<0.3.0,>=0.2.38, but you have langchain-core 0.3.1 which is incompatible. Successfully installed joblib-1.4.2 langchain-core-0.3.1 langchain_huggingface-0.1.0 networkx-3.3 nvidia-cublas-cu12-12.1.3.1 nvidia-cuda-cupti-cu12-12.1.105 nvidia-cuda-nvrtc-cu12-12.1.105 nvidia-cuda-runtime-cu12-12.1.105 nvidia-cudnn-cu12-9.1.0.70 nvidia-cufft-cu12-11.0.2.54 nvidia-curand-cu12-10.3.2.106 nvidia-cusolver-cu12-11.4.5.107 nvidia-cusparse-cu12-12.1.0.106 nvidia-nccl-cu12-2.20.5 nvidia-nvjitlink-cu12-12.6.68 nvidia-nvtx-cu12-12.1.105 safetensors-0.4.5 scikit-learn-1.5.2 scipy-1.14.1 sentence-transformers-3.1.0 threadpoolctl-3.5.0 tokenizers-0.19.1 torch-2.4.1 transformers-4.44.2 triton-3.0.0 WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv

[notice] A new release of pip is available: 23.2.1 -> 24.2 [notice] To update, run: pip install --upgrade pip root@74f3bf8ff518:/app/crewAI#

`/usr/local/lib/python3.11/site-packages/langchain_openai/chat_models/__init__.py:1`: LangChainDeprecationWarning: As of langchain-core 0.3.0, LangChain uses pydantic v2 internally. The langchain_core.pydantic_v1 module was a compatibility shim for pydantic v1, and should no longer be used. Please update the code to import from Pydantic directly.

For example, replace imports like `from langchain_core.pydantic_v1 import BaseModel` with `from pydantic import BaseModel`, or the v1 compatibility namespace (`from pydantic.v1 import BaseModel`) if you are working in a code base that has not been fully upgraded to pydantic 2 yet.

from langchain_openai.chat_models.azure import AzureChatOpenAI
/usr/local/lib/python3.11/site-packages/pydantic/_internal/_config.py:341: UserWarning: Valid config keys have changed in V2:

  • 'allow_population_by_field_name' has been renamed to 'populate_by_name' warnings.warn(message, UserWarning) Traceback (most recent call last): File "/app/crewAI/dr_jekyll_mr_hyde_crew.py", line 8, in from crewai import Crew, Agent, Task, Process File "/usr/local/lib/python3.11/site-packages/crewai/init.py", line 2, in from crewai.agent import Agenthttps://errors.pydantic.dev/2.9/u/custom-json-schema File "/usr/local/lib/python3.11/site-packages/crewai/agent.py", line 9, in from crewai.agents.crew_agent_executor import CrewAgentExecutor File "/usr/local/lib/python3.11/site-packages/crewai/agents/crew_agent_executor.py", line 5, in from crewai.agents.agent_builder.base_agent_executor_mixin import CrewAgentExecutorMixin File "/usr/local/lib/python3.11/site-packages/crewai/agents/agent_builder/base_agent_executor_mixin.py", line 4, in from crewai.memory.entity.entity_memory_item import EntityMemoryItem File "/usr/local/lib/python3.11/site-packages/crewai/memory/init.py", line 1, in from .entity.entity_memory import EntityMemory File "/usr/local/lib/python3.11/site-packages/crewai/memory/entity/entity_memory.py", line 3, in from crewai.memory.storage.rag_storage import RAGStorage File "/usr/local/lib/python3.11/site-packages/crewai/memory/storage/rag_storage.py", line 8, in from embedchain import App File "/usr/local/lib/python3.11/site-packages/embedchain/init.py", line 5, in from embedchain.app import App # noqa: F401 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.11/site-packages/embedchain/app.py", line 33, in from embedchain.llm.openai import OpenAILlm File "/usr/local/lib/python3.11/site-packages/embedchain/llm/openai.py", line 8, in from langchain_openai import ChatOpenAI File "/usr/local/lib/python3.11/site-packages/langchain_openai/init.py", line 1, in from langchain_openai.chat_models import AzureChatOpenAI, ChatOpenAI File "/usr/local/lib/python3.11/site-packages/langchain_openai/chat_models/init.py", line 1, in from langchain_openai.chat_models.azure import AzureChatOpenAI File "/usr/local/lib/python3.11/site-packages/langchain_openai/chat_models/azure.py", line 41, in from langchain_openai.chat_models.base import BaseChatOpenAI File "/usr/local/lib/python3.11/site-packages/langchain_openai/chat_models/base.py", line 379, in class BaseChatOpenAI(BaseChatModel): File "/usr/local/lib/python3.11/site-packages/pydantic/_internal/_model_construction.py", line 224, in new complete_model_class( File "/usr/local/lib/python3.11/site-packages/pydantic/_internal/_model_construction.py", line 577, in complete_model_class schema = cls.get_pydantic_core_schema(cls, handler) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.11/site-packages/pydantic/main.py", line 671, in get_pydantic_core_schema return handler(source) ^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.11/site-packages/pydantic/_internal/_schema_generation_shared.py", line 83, in call schema = self._handler(source_type) ^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.11/site-packages/pydantic/_internal/_generate_schema.py", line 655, in generate_schema schema = self._generate_schema_inner(obj) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.11/site-packages/pydantic/_internal/_generate_schema.py", line 924, in _generate_schema_inner return self._model_schema(obj) ^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.11/site-packages/pydantic/_internal/_generate_schema.py", line 739, in _model_schema {k: self._generate_md_field_schema(k, v, decorators) for k, v in 
fields.items()}, ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.11/site-packages/pydantic/_internal/_generate_schema.py", line 739, in {k: self._generate_md_field_schema(k, v, decorators) for k, v in fields.items()}, ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.11/site-packages/pydantic/_internal/_generate_schema.py", line 1115, in _generate_md_field_schema common_field = self._common_field_schema(name, field_info, decorators) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.11/site-packages/pydantic/_internal/_generate_schema.py", line 1308, in _common_field_schema schema = self._apply_annotations( ^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.11/site-packages/pydantic/_internal/_generate_schema.py", line 2107, in _apply_annotations schema = get_inner_schema(source_type) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.11/site-packages/pydantic/_internal/_schema_generation_shared.py", line 83, in call schema = self._handler(source_type) ^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.11/site-packages/pydantic/_internal/_generate_schema.py", line 2088, in inner_handler schema = self._generate_schema_inner(obj) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.11/site-packages/pydantic/_internal/_generate_schema.py", line 929, in _generate_schema_inner return self.match_type(obj) ^^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.11/site-packages/pydantic/_internal/_generate_schema.py", line 1029, in match_type return self._match_generic_type(obj, origin) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.11/site-packages/pydantic/_internal/_generate_schema.py", line 1058, in _match_generic_type return self._union_schema(obj) ^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.11/site-packages/pydantic/_internal/_generate_schema.py", line 1378, in _union_schema choices.append(self.generate_schema(arg)) ^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.11/site-packages/pydantic/_internal/_generate_schema.py", line 657, in generate_schema metadata_js_function = _extract_get_pydantic_json_schema(obj, schema) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.11/site-packages/pydantic/_internal/_generate_schema.py", line 2447, in _extract_get_pydantic_json_schema raise PydanticUserError( pydantic.errors.PydanticUserError: The __modify_schema__ method is not supported in Pydantic v2. Use __get_pydantic_json_schema__ instead in class SecretStr.

For further information visit https://errors.pydantic.dev/2.9/u/custom-json-schema

quantumalchemy avatar Sep 18 '24 16:09 quantumalchemy

To get memory to work with the huggingface embedder (in my case), downgrade to langchain_huggingface==0.0.3.
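For anyone hitting the same thing, a minimal sketch of where those memory settings live (the agent and task here are placeholders, not the original crew):

```python
# Illustrative only: a tiny crew showing where memory=True and the huggingface
# embedder config go; the memory backend is what pulls in langchain_huggingface.
from crewai import Agent, Crew, Process, Task

researcher = Agent(
    role="Researcher",
    goal="Collect background information on AI LLMs",
    backstory="Placeholder agent, included only to make the memory config concrete.",
)

research_task = Task(
    description="Summarize recent developments in AI LLMs.",
    expected_output="A short bullet-point summary.",
    agent=researcher,
)

crew = Crew(
    agents=[researcher],
    tasks=[research_task],
    process=Process.sequential,
    memory=True,
    embedder={
        "provider": "huggingface",
        "config": {"model": "mixedbread-ai/mxbai-embed-large-v1"},
    },
)
```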

quantumalchemy avatar Sep 19 '24 13:09 quantumalchemy

This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.

github-actions[bot] avatar Oct 20 '24 12:10 github-actions[bot]

> Do you have a Pydantic model defined anywhere? Please post all your code.

> As I stated in my bug report, I did not really deviate from the Getting Started tutorial, except for adding the langchain-ollama integration as described here: https://docs.crewai.com/how-to/LLM-Connections/#ollama-integration-step-by-step-ex-for-using-llama-31-8b-locally
>
> I created a new project with `crewai create crew PROJECTNAME` and changed the crew.py file to use Ollama. Here is my crew.py file: crew.py.zip

Here's how I used the LLM class from CrewAI, and it worked for me. Check out the code below:


from crewai import Agent, Crew, Process, Task, LLM

... 

@CrewBase
class Toy1Crew():
	"""Personality crew"""
	agents_config = 'config/agents.yaml'
	tasks_config = 'config/tasks.yaml'


	@agent
	def researcher(self) -> Agent:
		return Agent(
			config=self.agents_config['researcher'],
			# tools=[MyCustomTool()], # Example of custom tool, loaded on the beginning of file
			verbose=True,
			llm=LLM(model="ollama/llama3.1:8b", base_url="http://localhost:11434"),
			function_calling_llm=LLM(model="ollama/mistral:7b-instruct", base_url="http://localhost:11434"),
		)

	@agent
	def reporting_analyst(self) -> Agent:
		return Agent(
			config=self.agents_config['reporting_analyst'],
			verbose=True,
			llm=LLM(model="ollama/llama3.1:8b", base_url="http://localhost:11434"),
			function_calling_llm=LLM(model="ollama/mistral:7b-instruct", base_url="http://localhost:11434"),
		)

...
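If it helps, the matching entry point (a sketch following the layout that `crewai create crew` generates; the module path is hypothetical) is just:

```python
# Assumed usage, mirroring the generated main.py: build the crew from the
# @CrewBase class above and kick it off with some inputs.
from toy1_crew.crew import Toy1Crew  # hypothetical package/module path

def run():
    inputs = {"topic": "AI LLMs"}
    Toy1Crew().crew().kickoff(inputs=inputs)

if __name__ == "__main__":
    run()
```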

flingjie avatar Oct 24 '24 01:10 flingjie

This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.

github-actions[bot] avatar Nov 23 '24 12:11 github-actions[bot]

This issue was closed because it has been stalled for 5 days with no activity.

github-actions[bot] avatar Nov 29 '24 12:11 github-actions[bot]