
Provider Request: AWS Bedrock

Open pedro-a-n-moreira opened this issue 5 months ago • 3 comments

Hi, is there any reason why AWS Bedrock is not supported as a provider yet? If not, are there any plans for it? I'm asking because the original LangChain library supports it. I'm interested in building the integration if that's the case.

pedro-a-n-moreira avatar Jul 04 '25 20:07 pedro-a-n-moreira

@pedro-a-n-moreira Bedrock already exists as a provider. There is an example here.

I have successfully used the bedrock provider with meta.llama3-2-3b-instruct-v1:0 and anthropic.claude-sonnet-4-20250514-v1:0 for text inference, but there are some issues with Amazon Nova, which I raised in Issue #1336.
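
For reference, this is roughly how I call the provider. This is a minimal sketch; the option and helper names assume the current `llms/bedrock` package and `llms.GenerateFromSinglePrompt`, so adjust for your version and AWS setup:

```go
// Minimal sketch of text inference through the Bedrock provider.
// Credentials come from the default AWS credential chain (env vars, shared config, etc.).
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/tmc/langchaingo/llms"
	"github.com/tmc/langchaingo/llms/bedrock"
)

func main() {
	ctx := context.Background()

	llm, err := bedrock.New(
		bedrock.WithModel("meta.llama3-2-3b-instruct-v1:0"),
	)
	if err != nil {
		log.Fatal(err)
	}

	out, err := llms.GenerateFromSinglePrompt(ctx, llm, "Say hello in one sentence.")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(out)
}
```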

HeinKalido avatar Jul 22 '25 16:07 HeinKalido

~~Not sure if this is related to Nova, but Bedrock is only supported with on-demand calls as far as I can see.~~ edit: Nevermind, using the main branch seems to work.

Call with anthropic.claude-sonnet-4-20250514-v1:0:

operation error Bedrock Runtime: InvokeModel, https response error StatusCode: 400, RequestID: 62289f28-0be4-49a8-8711-09b5d410098a, ValidationException: Invocation of model ID anthropic.claude-sonnet-4-20250514-v1:0 with on-demand throughput isn’t supported. Retry your request with the ID or ARN of an inference profile that contains this model.

Call with us.anthropic.claude-sonnet-4-20250514-v1:0:

unsupported provider
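
For reproduction, the two calls were essentially the following (a sketch; option and function names are assumed to match the current `llms/bedrock` package, and the errors above are what came back):

```go
// Sketch reproducing the two failures described above.
package main

import (
	"context"
	"fmt"

	"github.com/tmc/langchaingo/llms"
	"github.com/tmc/langchaingo/llms/bedrock"
)

// tryModel attempts a single prompt with the given model ID and prints the resulting error.
func tryModel(ctx context.Context, modelID string) {
	llm, err := bedrock.New(bedrock.WithModel(modelID))
	if err != nil {
		fmt.Printf("%s: new: %v\n", modelID, err)
		return
	}
	_, err = llms.GenerateFromSinglePrompt(ctx, llm, "ping")
	fmt.Printf("%s: %v\n", modelID, err)
}

func main() {
	ctx := context.Background()

	// Rejected by Bedrock itself: on-demand throughput isn't supported for this model.
	tryModel(ctx, "anthropic.claude-sonnet-4-20250514-v1:0")

	// Rejected by the library with "unsupported provider".
	tryModel(ctx, "us.anthropic.claude-sonnet-4-20250514-v1:0")
}
```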

sollniss avatar Jul 30 '25 03:07 sollniss

I am also experiencing this. In Amazon Bedrock, in order to invoke some models you must use an "Inference Profile".

This basically means you have to prefix the modelId with a region prefix (e.g. us.anthropic....) or use an inference profile ARN (an ARN pointing to a model in a specific region). Neither works in langchaingo.

The first gets an "unsupported provider" error and the second gets a "malicious model id" error.

This is because langchaingo's Bedrock client expects the first part of the model ID (split by ".") to be the provider, when in this case it is actually the region prefix "us".

So you can't use inference profiles at all in Amazon Bedrock with this library. @HeinKalido, this means you can use Amazon Bedrock, but only under very limited conditions and clearly not for production use. Please let me know if you recognize this as an issue that should be resolved. I would love to contribute a PR to fix this and support region prefixes/ARNs for Amazon Bedrock model IDs; a sketch of the idea follows below.
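
To illustrate, here is a minimal sketch of the parsing behaviour and one possible direction for a fix. This is not the actual langchaingo source; the function names and the prefix list are hypothetical and only meant to show the idea:

```go
// Demonstrates why a cross-region inference profile ID breaks provider dispatch,
// and how stripping known geo prefixes before resolving the provider could fix it.
package main

import (
	"fmt"
	"strings"
)

// naiveProvider mirrors the current behaviour: the provider is everything before the first ".".
func naiveProvider(modelID string) string {
	return strings.SplitN(modelID, ".", 2)[0]
}

// knownGeoPrefixes is a hypothetical allow-list of inference-profile region prefixes.
var knownGeoPrefixes = map[string]bool{"us": true, "eu": true, "apac": true}

// providerWithPrefixSupport strips a leading geo prefix before resolving the provider.
func providerWithPrefixSupport(modelID string) string {
	parts := strings.Split(modelID, ".")
	if len(parts) > 1 && knownGeoPrefixes[parts[0]] {
		parts = parts[1:]
	}
	return parts[0]
}

func main() {
	id := "us.anthropic.claude-sonnet-4-20250514-v1:0"
	fmt.Println(naiveProvider(id))             // "us" -> unsupported provider
	fmt.Println(providerWithPrefixSupport(id)) // "anthropic"
}
```

Supporting full inference profile ARNs would need an extra step (extracting the model ID from the ARN before this lookup), but the prefix case alone would already unblock cross-region profiles.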

liadyacobi avatar Sep 17 '25 16:09 liadyacobi