
The Bedrock converse example is not working

Open shreyashpatel404 opened this issue 8 months ago • 5 comments

I'm encountering an error when trying to run the example code from the Instructor documentation for Amazon Bedrock integration. The example code fails with parameter validation errors.

Error:

Traceback (most recent call last):
  File "C:\Work Stuff\New folder\swo-ai-canvas-financial-agent\intstrucotr.py", line 22, in <module>
    user = client.converse(
  File "C:\Users\ext.shreyash.patel\AppData\Local\miniconda3\envs\swo-env-1\lib\site-packages\botocore\client.py", line 570, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "C:\Users\ext.shreyash.patel\AppData\Local\miniconda3\envs\swo-env-1\lib\site-packages\botocore\context.py", line 124, in wrapper
    return func(*args, **kwargs)
  File "C:\Users\ext.shreyash.patel\AppData\Local\miniconda3\envs\swo-env-1\lib\site-packages\botocore\client.py", line 988, in _make_api_call
    request_dict = self._convert_to_request_dict(
  File "C:\Users\ext.shreyash.patel\AppData\Local\miniconda3\envs\swo-env-1\lib\site-packages\botocore\client.py", line 1055, in _convert_to_request_dict
    request_dict = self._serializer.serialize_to_request(
  File "C:\Users\ext.shreyash.patel\AppData\Local\miniconda3\envs\swo-env-1\lib\site-packages\botocore\validate.py", line 381, in serialize_to_request
    raise ParamValidationError(report=report.generate_report())
botocore.exceptions.ParamValidationError: Parameter validation failed:
Unknown parameter in input: "response_model", must be one of: modelId, messages, system, inferenceConfig, toolConfig, guardrailConfig, additionalModelRequestFields, promptVariables, additionalModelResponseFieldPaths, requestMetadata, performanceConfig
Invalid type for parameter messages[0].content, value: Extract: Jason is 25 years old, type: <class 'str'>, valid types: <class 'list'>, <class 'tuple'>
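For reference, both validation failures in the report can be reproduced without touching AWS: converse rejects any keyword outside its allowed set (so the response_model that instructor adds fails validation), and each message's content must be a list of content blocks rather than a bare string. A minimal pure-Python sketch of those two checks (the validate_converse_kwargs helper is hypothetical, written here only to mirror botocore's report):

```python
# Parameters accepted by bedrock-runtime's converse operation,
# per the ParamValidationError report above.
ALLOWED = {
    "modelId", "messages", "system", "inferenceConfig", "toolConfig",
    "guardrailConfig", "additionalModelRequestFields", "promptVariables",
    "additionalModelResponseFieldPaths", "requestMetadata", "performanceConfig",
}


def validate_converse_kwargs(kwargs):
    """Mimic the two botocore checks that fail in the traceback above."""
    errors = []
    for key in kwargs:
        if key not in ALLOWED:
            errors.append(f'Unknown parameter in input: "{key}"')
    for i, msg in enumerate(kwargs.get("messages", [])):
        # content must be a list/tuple of blocks like [{"text": "..."}]
        if not isinstance(msg.get("content"), (list, tuple)):
            errors.append(f"Invalid type for parameter messages[{i}].content")
    return errors


# The doc example's call shape trips both checks:
report = validate_converse_kwargs({
    "modelId": "anthropic.claude-3-sonnet-20240229-v1:0",
    "messages": [{"role": "user", "content": "Extract: Jason is 25 years old"}],
    "response_model": object,
})
```

Passing the same kwargs with response_model removed and content wrapped as [{"text": "..."}] yields an empty error list.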

shreyashpatel404 avatar Apr 07 '25 10:04 shreyashpatel404

I was facing the same issue. I believe the docs are outdated. I found an alternative approach that works:

user = client.chat.completions.create(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    system=[{'text': 'You are a helpful assistant.'}],
    messages=[{'role': 'user', 'content': [{'text': "Extract: Jason is 22 years old"}]}],
    response_model=User,
)

system is optional. The format of messages and system follows the BedrockRuntime.Client.converse docs (see the Request Syntax section for details).

I tried client.converse with the formats above, but it does not accept the response_model parameter.

vittoriop17 avatar Apr 07 '25 15:04 vittoriop17

Okay, Thanks!

shreyashpatel404 avatar Apr 08 '25 03:04 shreyashpatel404

@vittoriop17 Were you able to make this work using Mode.BEDROCK_TOOLS?

ClydeAmazing avatar Apr 14 '25 21:04 ClydeAmazing

Still not working with BEDROCK_TOOLS

pazevedo-hyland avatar Apr 23 '25 16:04 pazevedo-hyland

I can confirm the error as well. Is there any way to support this @jxnl ? Unfortunately, the traceback gives no indication that the error stems from the instructor library patching the client incorrectly. Any hints where I could start?

Steinkreis avatar Apr 27 '25 09:04 Steinkreis

For those using Anthropic through Bedrock:

Instead of boto3, you can use Anthropic's Bedrock wrapper (from anthropic import AnthropicBedrock) to initialize the client and pass it to instructor.from_anthropic(client).

This also gives you async support out of the box without having to use other AWS services.

from anthropic import AnthropicBedrock

bedrock_anthropic = AnthropicBedrock(
    # Authenticate by either providing the keys below or use the default AWS credential providers, such as
    # using ~/.aws/credentials or the "AWS_SECRET_ACCESS_KEY" and "AWS_ACCESS_KEY_ID" environment variables.
    aws_access_key="<access key>",
    aws_secret_key="<secret key>",
    # Temporary credentials can be used with aws_session_token.
    # Read more at https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_temp.html.
    aws_session_token="<session_token>",
    # aws_region changes the aws region to which the request is made. By default, we read AWS_REGION,
    # and if that's not present, we default to us-east-1. Note that we do not read ~/.aws/config for the region.
    aws_region="us-west-2",
)

client = instructor.from_anthropic(bedrock_anthropic)

model_name = 'eu.anthropic.claude-3-7-sonnet-20250219-v1:0'

client.messages.create(
    model=model_name,  # use the AWS model name here
    system=system_prompt,
    messages=[{"role": "user", "content": user_content}],  # your content in Anthropic's structure
    response_model=output_model,  # your pydantic model
    **params,
)


purdonkle avatar May 13 '25 11:05 purdonkle

Here are two options that work for me...

First using AnthropicBedrock:

import os
import anthropic
import instructor
from pydantic import BaseModel

client = anthropic.AnthropicBedrock(
    aws_access_key=os.getenv("AWS_ACCESS_KEY_ID"),
    aws_secret_key=os.getenv("AWS_SECRET_ACCESS_KEY"),
    aws_session_token=os.getenv("AWS_SESSION_TOKEN"),
    aws_region=os.getenv("AWS_DEFAULT_REGION"),
)

instructor_client = instructor.from_anthropic(
    client,
    max_tokens=1024,
    model="anthropic.claude-3-haiku-20240307-v1:0"
)

class User(BaseModel):
    name: str
    age: int
    

user = instructor_client.chat.completions.create(
    system='You are a helpful assistant.',
    messages=[{'role': 'user', 'content': "Extract: Jason is 22 years old"}],
    response_model=User,
)

print(user)
# > User(name='Jason', age=22)

then next using boto3:

import os
import boto3
import instructor
from pydantic import BaseModel

# Initialize the Bedrock client
bedrock_client = boto3.client(
    'bedrock-runtime',
    aws_access_key_id=os.getenv("AWS_ACCESS_KEY_ID"),
    aws_secret_access_key=os.getenv("AWS_SECRET_ACCESS_KEY"),
    aws_session_token=os.getenv("AWS_SESSION_TOKEN"),
    region_name=os.getenv("AWS_REGION")
)

# Enable instructor patches for Bedrock client
instructor_client = instructor.from_bedrock(bedrock_client)


class User(BaseModel):
    name: str
    age: int


user = instructor_client.chat.completions.create(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",
    system=[{'text': 'You are a helpful assistant.'}],
    messages=[{'role': 'user', 'content': [{'text': "Extract: Jason is 22 years old"}]}],
    response_model=User,
)

print(user)
# > User(name='Jason', age=22)

There are minor differences between the two, mainly in the structure of the system and messages fields.
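The mapping between the two call shapes is mechanical: model becomes modelId, the system string becomes a list of text blocks, and each message's string content gets wrapped in a list of blocks. A hypothetical helper (not part of instructor; names are illustrative) that converts the first example's kwargs into the second's:

```python
def openai_to_converse_kwargs(model, system, messages):
    """Convert OpenAI-style arguments to the shape Bedrock's converse expects."""
    return {
        "modelId": model,
        "system": [{"text": system}],
        "messages": [
            {
                "role": m["role"],
                # Wrap bare strings as content blocks; pass lists through as-is.
                "content": [{"text": m["content"]}]
                if isinstance(m["content"], str)
                else m["content"],
            }
            for m in messages
        ],
    }


kw = openai_to_converse_kwargs(
    model="anthropic.claude-3-haiku-20240307-v1:0",
    system="You are a helpful assistant.",
    messages=[{"role": "user", "content": "Extract: Jason is 22 years old"}],
)
```

This is essentially the translation the PR linked below automates inside the patched client.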

jaeyow avatar May 15 '25 02:05 jaeyow

We have a lot of instructor code that uses chat.completions.create that I don't want to migrate, especially if we move off Bedrock. In light of this I've created a PR that processes the model and messages kwarg differences: https://github.com/567-labs/instructor/pull/1529

This would mean that you could transparently switch from from_openai (or whatever) to from_bedrock without having to change instances of client.chat.completions.create.

I haven't tested this with the newer from_provider stuff, so no idea if that works.

I separately created https://github.com/567-labs/instructor/pull/1528 to fix the documentation but this will be invalidated by #1529 once it's merged.

dogonthehorizon avatar May 15 '25 22:05 dogonthehorizon