CrewAI flows throw pydantic validation errors
Bug Description
When I try to run the default Hierarchical Agent template for CrewAI, it throws pydantic errors after a few iterations. I am aware the errors point at langchain, but maybe you have seen this before? Without proper logging it is also hard for me to show at which step it gets stuck. Is there a way to get a file logger or something similar working for CrewAI in Langflow?
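For context, this is roughly the setup I would expect to enable file logging before starting the server; the LANGFLOW_LOG_LEVEL / LANGFLOW_LOG_FILE variable names and the log path are assumptions to verify against the installed version, not something I have confirmed:

# Rough sketch: start Langflow with verbose file logging.
# LANGFLOW_LOG_LEVEL and LANGFLOW_LOG_FILE are assumed to be the environment
# variables Langflow reads for this; the log path is a placeholder.
import os
import subprocess

env = os.environ.copy()
env["LANGFLOW_LOG_LEVEL"] = "debug"
env["LANGFLOW_LOG_FILE"] = "/tmp/langflow.log"  # placeholder path

# Launch the server; the child process inherits the logging settings above.
subprocess.run(["langflow", "run"], env=env, check=True)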
Error building Component Hierarchical Crew:
1 validation error for AIMessageChunk
usage_metadata -> input_tokens
none is not an allowed value (type=type_error.none.not_allowed)
Traceback (most recent call last):
File "/app/.venv/lib/python3.12/site-packages/langflow/graph/vertex/base.py", line 690, in _build_results
result = await initialize.loading.get_instance_results(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/langflow/interface/initialize/loading.py", line 60, in get_instance_results
return await build_component(params=custom_params, custom_component=custom_component)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/langflow/interface/initialize/loading.py", line 147, in build_component
build_results, artifacts = await custom_component.build_results()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/langflow/custom/custom_component/component.py", line 570, in build_results
return await self._build_with_tracing()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/langflow/custom/custom_component/component.py", line 560, in _build_with_tracing
_results, _artifacts = await self._build_results()
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/langflow/custom/custom_component/component.py", line 595, in _build_results
result = await result
^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/langflow/base/agents/crewai/crew.py", line 81, in build_output
result = await crew.kickoff_async()
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/crewai/crew.py", line 394, in kickoff_async
return await asyncio.to_thread(self.kickoff, inputs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/asyncio/threads.py", line 25, in to_thread
return await loop.run_in_executor(None, func_call)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/asyncio/futures.py", line 287, in __await__
yield self # This tells Task to wait for completion.
^^^^^^^^^^
File "/usr/local/lib/python3.12/asyncio/tasks.py", line 385, in __wakeup
future.result()
File "/usr/local/lib/python3.12/asyncio/futures.py", line 203, in result
raise self._exception.with_traceback(self._exception_tb)
File "/usr/local/lib/python3.12/concurrent/futures/thread.py", line 58, in run
result = self.fn(*self.args, **self.kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/crewai/crew.py", line 348, in kickoff
result, manager_metrics = self._run_hierarchical_process() # type: ignore # Incompatible types in assignment (expression has type "str | dict[str, Any]", variable has type "str")
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/crewai/crew.py", line 501, in _run_hierarchical_process
task_output = task.execute(
^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/crewai/task.py", line 217, in execute
result = self._execute(
^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/crewai/task.py", line 226, in _execute
result = agent.execute_task(
^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/crewai/agent.py", line 185, in execute_task
result = self.agent_executor.invoke(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/langchain/chains/base.py", line 164, in invoke
raise e
File "/app/.venv/lib/python3.12/site-packages/langchain/chains/base.py", line 154, in invoke
self._call(inputs, run_manager=run_manager)
File "/app/.venv/lib/python3.12/site-packages/crewai/agents/executor.py", line 80, in _call
next_step_output = self._take_next_step(
^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/langchain/agents/agent.py", line 1314, in _take_next_step
[
File "/app/.venv/lib/python3.12/site-packages/crewai/agents/executor.py", line 144, in _iter_next_step
output = self.agent.plan( # type: ignore # Incompatible types in assignment (expression has type "AgentAction | AgentFinish | list[AgentAction]", variable has type "AgentAction")
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/langchain/agents/agent.py", line 461, in plan
for chunk in self.runnable.stream(inputs, config={"callbacks": callbacks}):
File "/app/.venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 3262, in stream
yield from self.transform(iter([input]), config, **kwargs)
File "/app/.venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 3249, in transform
yield from self._transform_stream_with_config(
File "/app/.venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 2054, in _transform_stream_with_config
chunk: Output = context.run(next, iterator) # type: ignore
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 3211, in _transform
for output in final_pipeline:
File "/app/.venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 1272, in transform
for ichunk in input:
File "/app/.venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 5301, in transform
yield from self.bound.transform(
File "/app/.venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 1290, in transform
yield from self.stream(final, config, **kwargs)
File "/app/.venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 410, in stream
raise e
File "/app/.venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 390, in stream
for chunk in self._stream(messages, stop=stop, **kwargs):
File "/app/.venv/lib/python3.12/site-packages/langchain_openai/chat_models/base.py", line 572, in _stream
message=default_chunk_class( # type: ignore[call-arg]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/langchain_core/messages/ai.py", line 94, in __init__
super().__init__(content=content, **kwargs)
File "/app/.venv/lib/python3.12/site-packages/langchain_core/messages/base.py", line 66, in __init__
super().__init__(content=content, **kwargs)
File "/app/.venv/lib/python3.12/site-packages/langchain_core/load/serializable.py", line 113, in __init__
super().__init__(*args, **kwargs)
File "/app/.venv/lib/python3.12/site-packages/pydantic/v1/main.py", line 341, in __init__
raise validation_error
pydantic.v1.error_wrappers.ValidationError: 1 validation error for AIMessageChunk
usage_metadata -> input_tokens
none is not an allowed value (type=type_error.none.not_allowed)
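For what it's worth, the failing validation at the bottom of the traceback can apparently be reproduced outside Langflow by constructing the chunk directly. This is a minimal sketch assuming the langchain-core version pinned by Langflow 1.0.17, which still validates messages with pydantic v1:

from langchain_core.messages import AIMessageChunk

# langchain-openai appears to build a streamed chunk whose usage_metadata
# carries input_tokens=None, which the pydantic v1 model behind AIMessageChunk
# rejects with the same "none is not an allowed value" error shown above.
AIMessageChunk(
    content="partial answer",
    usage_metadata={"input_tokens": None, "output_tokens": 12, "total_tokens": 12},
)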
Reproduction
- Use Langflow 1.0.17.
- Create a new flow from the Hierarchical Agent template.
- Run the flow (a rough programmatic equivalent is sketched after this list).
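This is only a sketch of what the template seems to do under the hood, assuming the crewai version bundled with Langflow 1.0.17 (which, per the traceback, still drives agents through langchain); the roles, goals, and model name are placeholders rather than the template's actual values:

from crewai import Agent, Crew, Process, Task
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini", streaming=True)  # placeholder model name

researcher = Agent(
    role="Researcher",
    goal="Gather the facts needed to answer the user's question",
    backstory="A meticulous researcher.",
    llm=llm,
)

answer_task = Task(
    description="Answer the user's question.",
    expected_output="A short, well-sourced answer.",
    agent=researcher,
)

crew = Crew(
    agents=[researcher],
    tasks=[answer_task],
    process=Process.hierarchical,
    manager_llm=llm,  # hierarchical runs require a manager LLM
)

# kickoff() eventually reaches _run_hierarchical_process(), the frame where
# the traceback above enters crewai.
print(crew.kickoff())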
Expected behavior
The crew run completes and returns its response.
Who can help?
@italojohnny @ogabrielluiz @nicoloboschi @zzzming
Operating System
Ubuntu 24 LTS on Azure VM
Langflow Version
1.0.17
Python Version
3.12
Screenshot
Flow File
No response