[BUG] An error occurred (ValidationException) when calling the Converse operation: messages.4.content: Conversation blocks and tool result blocks cannot be provided in the same turn
Checks
- [x] I have updated to the latest minor and patch version of Strands
- [x] I have checked the documentation and this is not expected behavior
- [x] I have searched the existing issues and there are no duplicates of my issue
Strands Version
1.14.0
Python Version
3.13.5
Operating System
macOS
Installation Method
pip
Steps to Reproduce
- Use models like `llama4-maverick-17b-instruct-v1:0`
- Add some tools to the agent and use structured output
- Try to execute the agentic workflow where tool invocation is required
- Run multiple conversation turns so that the previous messages (including tool results) are sent in the next request.
Expected Behavior
The agent should run without any error and tool execution should be successful.
Actual Behavior
The agent run fails with the exception: `An error occurred (ValidationException) when calling the Converse operation: messages.4.content: Conversation blocks and tool result blocks cannot be provided in the same turn.`
Additional Context
```
return self.agent(enhanced_instruction)
  File "/Users/ajay.kumar/Documents/Github/agentic/.venv/lib/python3.13/site-packages/strands/agent/agent.py", line 427, in __call__
    return run_async(
        lambda: self.invoke_async(
            prompt, invocation_state=invocation_state, structured_output_model=structured_output_model, **kwargs
        )
    )
  File "/Users/ajay.kumar/Documents/Github/agentic/.venv/lib/python3.13/site-packages/strands/_async.py", line 31, in run_async
    return future.result()
  File "/opt/homebrew/Cellar/[email protected]/3.13.5/Frameworks/Python.framework/Versions/3.13/lib/python3.13/concurrent/futures/_base.py", line 456, in result
    return self.__get_result()
  File "/opt/homebrew/Cellar/[email protected]/3.13.5/Frameworks/Python.framework/Versions/3.13/lib/python3.13/concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
  File "/opt/homebrew/Cellar/[email protected]/3.13.5/Frameworks/Python.framework/Versions/3.13/lib/python3.13/concurrent/futures/thread.py", line 59, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/Users/ajay.kumar/Documents/Github/agentic/.venv/lib/python3.13/site-packages/opentelemetry/instrumentation/threading/__init__.py", line 171, in wrapped_func
    return original_func(*func_args, **func_kwargs)
  File "/Users/ajay.kumar/Documents/Github/agentic/.venv/lib/python3.13/site-packages/strands/_async.py", line 27, in execute
    return asyncio.run(execute_async())
  File "/opt/homebrew/Cellar/[email protected]/3.13.5/Frameworks/Python.framework/Versions/3.13/lib/python3.13/asyncio/runners.py", line 195, in run
    return runner.run(main)
  File "/opt/homebrew/Cellar/[email protected]/3.13.5/Frameworks/Python.framework/Versions/3.13/lib/python3.13/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
  File "/opt/homebrew/Cellar/[email protected]/3.13.5/Frameworks/Python.framework/Versions/3.13/lib/python3.13/asyncio/base_events.py", line 725, in run_until_complete
    return future.result()
  File "/Users/ajay.kumar/Documents/Github/agentic/.venv/lib/python3.13/site-packages/strands/_async.py", line 24, in execute_async
    return await async_func()
  File "/Users/ajay.kumar/Documents/Github/agentic/.venv/lib/python3.13/site-packages/strands/agent/agent.py", line 470, in invoke_async
    async for event in events:
        _ = event
  File "/Users/ajay.kumar/Documents/Github/agentic/.venv/lib/python3.13/site-packages/strands/agent/agent.py", line 666, in stream_async
    async for event in events:
    ...<5 lines>...
        yield as_dict
  File "/Users/ajay.kumar/Documents/Github/agentic/.venv/lib/python3.13/site-packages/strands/agent/agent.py", line 746, in _run_loop
    async for event in events:
    ...<13 lines>...
        yield event
  File "/Users/ajay.kumar/Documents/Github/agentic/.venv/lib/python3.13/site-packages/strands/agent/agent.py", line 794, in _execute_event_loop_cycle
    async for event in events:
        yield event
  File "/Users/ajay.kumar/Documents/Github/agentic/.venv/lib/python3.13/site-packages/strands/event_loop/event_loop.py", line 152, in event_loop_cycle
    async for model_event in model_events:
        if not isinstance(model_event, ModelStopReason):
            yield model_event
  File "/Users/ajay.kumar/Documents/Github/agentic/.venv/lib/python3.13/site-packages/strands/event_loop/event_loop.py", line 391, in _handle_model_execution
    raise e
  File "/Users/ajay.kumar/Documents/Github/agentic/.venv/lib/python3.13/site-packages/strands/event_loop/event_loop.py", line 337, in _handle_model_execution
    async for event in stream_messages(
    ...<2 lines>...
        yield event
  File "/Users/ajay.kumar/Documents/Github/agentic/.venv/lib/python3.13/site-packages/strands/event_loop/streaming.py", line 441, in stream_messages
    async for event in process_stream(chunks, start_time):
        yield event
  File "/Users/ajay.kumar/Documents/Github/agentic/.venv/lib/python3.13/site-packages/strands/event_loop/streaming.py", line 388, in process_stream
    async for chunk in chunks:
    ...<22 lines>...
        handle_redact_content(chunk["redactContent"], state)
  File "/Users/ajay.kumar/Documents/Github/agentic/.venv/lib/python3.13/site-packages/strands/models/bedrock.py", line 633, in stream
    await task
  File "/opt/homebrew/Cellar/[email protected]/3.13.5/Frameworks/Python.framework/Versions/3.13/lib/python3.13/asyncio/threads.py", line 25, in to_thread
    return await loop.run_in_executor(None, func_call)
  File "/opt/homebrew/Cellar/[email protected]/3.13.5/Frameworks/Python.framework/Versions/3.13/lib/python3.13/concurrent/futures/thread.py", line 59, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/Users/ajay.kumar/Documents/Github/agentic/.venv/lib/python3.13/site-packages/opentelemetry/instrumentation/threading/__init__.py", line 171, in wrapped_func
    return original_func(*func_args, **func_kwargs)
  File "/Users/ajay.kumar/Documents/Github/agentic/.venv/lib/python3.13/site-packages/strands/models/bedrock.py", line 754, in _stream
    raise e
  File "/Users/ajay.kumar/Documents/Github/agentic/.venv/lib/python3.13/site-packages/strands/models/bedrock.py", line 703, in _stream
    response = self.client.converse(**request)
  File "/Users/ajay.kumar/Documents/Github/agentic/.venv/lib/python3.13/site-packages/botocore/client.py", line 602, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/Users/ajay.kumar/Documents/Github/agentic/.venv/lib/python3.13/site-packages/botocore/context.py", line 123, in wrapper
    return func(*args, **kwargs)
  File "/Users/ajay.kumar/Documents/Github/agentic/.venv/lib/python3.13/site-packages/botocore/client.py", line 1078, in _make_api_call
    raise error_class(parsed_response, operation_name)
botocore.errorfactory.ValidationException: An error occurred (ValidationException) when calling the Converse operation: messages.4.content: Conversation blocks and tool result blocks cannot be provided in the same turn.
└ Bedrock region: us-east-1
└ Model id: us.meta.llama4-maverick-17b-instruct-v1:0
```
Possible Solution
No response
Related Issues
No response
Below is an integration test that reproduces the issue. The issue is reproducible with models other than Anthropic models.
```python
"""Comprehensive integration tests for structured output passed into the agent functionality."""

from pydantic import BaseModel, Field

from strands import Agent
from strands.models.bedrock import BedrockModel
from strands.tools import tool


class MathResult(BaseModel):
    """Math operation result."""

    operation: str = Field(description="the performed operation")
    result: int = Field(description="the result of the operation")


# ========== Tool Definitions ==========
@tool
def calculator(operation: str, a: float, b: float) -> float:
    """Simple calculator tool for testing."""
    if operation == "add":
        return a + b
    elif operation == "subtract":
        return a - b
    elif operation == "multiply":
        return a * b
    elif operation == "divide":
        return b / a if a != 0 else 0
    elif operation == "power":
        return a**b
    else:
        return 0


# ========== Test Classes ==========
class TestBedrockLlamaModelsToolUsageWithStructuredOutput:
    """Test structured output with tool usage."""

    def test_multi_turn_calculator_tool_use_with_structured_output(self):
        """Test tool usage with structured output."""
        model = BedrockModel(
            model_id="us.meta.llama4-maverick-17b-instruct-v1:0",
            region_name="us-east-1",
            max_tokens=2048,
            streaming=False,
        )
        agent = Agent(model=model, tools=[calculator])

        result = agent("Calculate 2 + 2 using the calculator tool", structured_output_model=MathResult)
        assert result.structured_output is not None
        assert isinstance(result.structured_output, MathResult)
        assert result.structured_output.result == 4

        # Check that the tool was called
        assert result.metrics.tool_metrics is not None
        assert len(result.metrics.tool_metrics) > 0

        result = agent(
            "What is 5 multiplied by 3? Use the calculator tool.",
            structured_output_model=MathResult,
        )
        assert result.structured_output is not None
```
@dbschmigelski Could you please take a look at this one and suggest workarounds, if any? This issue is not limited to Llama models.
The issue occurs only when using `structured_output_model`.
Hi, @azaylamba I was able to reproduce this. Thanks for raising this.
I am going to engage with the Bedrock team first. It appears that Bedrock is providing an inconsistent interface. I would rather see if we can have this fixed centrally, for all frameworks, rather than just in Strands. I will keep this thread up to date with those conversations.
Hi @dbschmigelski, thanks for the update. There is another similar issue, https://github.com/strands-agents/sdk-python/issues/1241, which, again, does not occur for Anthropic models but does occur for other providers on Bedrock.
Hi @dbschmigelski Did you get a chance to discuss this further internally?
Hi, sorry for the delay. We've been busy with a lot of re:Invent launches: bidirectional streaming, the TypeScript SDK, evals, and steering.
I reached out to Bedrock and they are aware of the issue and working on issues like this one. In the meantime though we will need to patch over it ourselves. I am going to make some changes to your PR though.
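One possible shape for such a patch-over, sketched framework-agnostically (this is a hypothetical helper, not an existing Strands or Bedrock API): split any turn that mixes `toolResult` blocks with other content blocks into separate messages, which is what the error message itself suggests. The history below is made up to resemble the failing `messages.4` turn.

```python
from typing import Any

def split_mixed_tool_result_turns(messages: list[dict[str, Any]]) -> list[dict[str, Any]]:
    """Split any message mixing toolResult blocks with other block types into
    a toolResult-only message followed by a message carrying the rest."""
    fixed: list[dict[str, Any]] = []
    for message in messages:
        blocks = message.get("content", [])
        tool_results = [b for b in blocks if "toolResult" in b]
        others = [b for b in blocks if "toolResult" not in b]
        if tool_results and others:
            fixed.append({"role": message["role"], "content": tool_results})
            fixed.append({"role": message["role"], "content": others})
        else:
            fixed.append(message)
    return fixed

# Hypothetical history resembling the failing turn: a toolResult mixed with the
# next user prompt in a single message.
history = [
    {"role": "user", "content": [{"text": "Calculate 2 + 2 using the calculator tool"}]},
    {
        "role": "user",
        "content": [
            {"toolResult": {"toolUseId": "tooluse_123", "content": [{"text": "4.0"}]}},
            {"text": "What is 5 multiplied by 3? Use the calculator tool."},
        ],
    },
]

print(len(split_mixed_tool_result_turns(history)))  # 3
```

Note this inserts extra messages into the history, which is exactly the side effect discussed below for conversation management.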
The majority of models DO support this behavior (Mistral, Llama). So what I want to avoid is impacting the Sliding Window conversation management by inserting messages when we don't need to, even if the token usage is not impacted.
What I believe Strands is missing is a config strategy for inter- and intra-provider-specific logic.
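One hypothetical shape such a config strategy could take (invented for illustration; not an existing Strands API): per-model capability flags keyed by model-id prefix, consulted before building a request, so quirk fixups are applied only where a provider needs them. The flag name and the table entries here are assumptions, with `us.meta.*` marked based on this issue.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ProviderQuirks:
    # Can a user turn mix toolResult blocks with plain text blocks?
    allows_mixed_tool_result_turns: bool = True

# Illustrative capability table; entries keyed by model-id prefix.
QUIRKS: dict[str, ProviderQuirks] = {
    "us.meta.": ProviderQuirks(allows_mixed_tool_result_turns=False),
}

def quirks_for(model_id: str) -> ProviderQuirks:
    """Longest-prefix lookup with a permissive default."""
    best = ProviderQuirks()
    best_len = -1
    for prefix, quirks in QUIRKS.items():
        if model_id.startswith(prefix) and len(prefix) > best_len:
            best, best_len = quirks, len(prefix)
    return best

print(quirks_for("us.meta.llama4-maverick-17b-instruct-v1:0").allows_mixed_tool_result_turns)  # False
print(quirks_for("us.anthropic.claude-sonnet-4-20250514-v1:0").allows_mixed_tool_result_turns)  # True
```

A table like this would let the provider skip message-splitting entirely for models that accept mixed turns, addressing the Sliding Window concern above.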
I am going to start #780, which will unblock this.
Thanks for the update @dbschmigelski