Amazon Bedrock inference (provider: bedrockimport) models don't work
Before submitting your bug report
- [x] I believe this is a bug. I'll try to join the Continue Discord for questions
- [x] I'm not able to find an open issue that reports the same bug
- [x] I've seen the troubleshooting guide on the Continue Docs
Relevant environment info
- OS: mint
- Continue version: 1.1.48
- IDE version: 1.101.0
- Model: several
- config:

```yaml
name: Local Assistant
version: 1.0.0
schema: v1
models:
  - name: AWS Bedrock Claude 4 Sonnet
    provider: bedrockimport
    model: eu.anthropic.claude-sonnet-4-20250514-v1:0
    env:
      modelArn: arn:aws:bedrock:eu-south-2:XXXXXXXXXXXX:inference-profile/eu.anthropic.claude-sonnet-4-20250514-v1:0
      region: eu-south-2
      profile: bedrock
    roles:
      - chat
  - name: AWS Bedrock Nova Pro
    provider: bedrockimport
    model: eu.amazon.nova-pro-v1:0
    env:
      modelArn: arn:aws:bedrock:eu-south-2:XXXXXXXXXXXX:inference-profile/eu.amazon.nova-pro-v1:0
      region: eu-south-2
      profile: bedrock
    roles:
      - chat
  - name: AWS Bedrock Deepseek R1
    provider: bedrockimport
    model: us.deepseek.r1-v1:0
    env:
      modelArn: arn:aws:bedrock:us-west-2:XXXXXXXXXXXX:inference-profile/us.deepseek.r1-v1:0
      region: us-west-2
      profile: bedrock
    roles:
      - chat
```
where XXXXXXXXXXXX is the redacted AWS account ID
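For reference, the `profile: bedrock` entries above assume a matching named profile in `~/.aws/credentials`, along these lines (placeholder keys, not real values):

```ini
[bedrock]
aws_access_key_id = AKIAXXXXXXXXXXXXXXXX
aws_secret_access_key = XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
```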
Description
While adding regular models from Amazon Bedrock works without issues, adding inference-profile models with a modelArn doesn't seem to work properly. Please check the section below for logs.
The documented way of configuring custom imported models doesn't seem to work and throws
Error No value provided for input HTTP label: modelId.
-> the modelArn must be placed under env
To reproduce
- Add the config above to your local assistant
- Replace XXXXXXXXXXXX with your AWS account ID
- Enable access to the models in Amazon Bedrock
- Add your credentials to the ~/.aws/credentials file
- Execute the test prompt
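Independent of Continue, the inference profiles themselves can be sanity-checked with a short boto3 script (a sketch only; it reuses the profile/region/ARN values from the config above and needs real AWS credentials plus the redacted account ID to actually run):

```python
# Sanity check: call the inference profile directly through the Bedrock
# Converse API, bypassing Continue entirely. Values mirror the config above.
def build_converse_kwargs(model_arn: str, prompt: str) -> dict:
    """Build the request arguments for bedrock-runtime's converse() call."""
    return {
        # An inference-profile ARN is accepted as the modelId.
        "modelId": model_arn,
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
    }

def main() -> None:
    import boto3  # local import so the helper above works without boto3 installed

    session = boto3.Session(profile_name="bedrock", region_name="eu-south-2")
    client = session.client("bedrock-runtime")
    kwargs = build_converse_kwargs(
        "arn:aws:bedrock:eu-south-2:XXXXXXXXXXXX:inference-profile/"
        "eu.anthropic.claude-sonnet-4-20250514-v1:0",  # XXXXXXXXXXXX = account ID
        "Say Hi",
    )
    response = client.converse(**kwargs)
    print(response["output"]["message"]["content"][0]["text"])

# To actually hit Bedrock (requires credentials): main()
```

If this script answers normally while Continue fails, the problem is in the bedrockimport provider rather than in the AWS setup.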
Results
A prompt of "Say Hi" leads to a different error for each model:

Claude 4 Sonnet
Error You must either implement templateMessages or _streamChat

Nova Pro
Error Malformed input request: #: required key [messages] not found, please reformat your input and try again.

Deepseek R1
Error Malformed JSON received from Bedrock: {"choices":[{"text":"Hi","stop_reason":null}]}
Other prompts lead to similar issues. Deepseek seems to generate truncated output, probably only the first line or so. Claude is the only model that receives the system prompt; the others get only the user prompt (for unknown reasons).
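For context on the Nova error: "required key [messages] not found" suggests Continue is not sending Nova's native request shape. A minimal body in the format the Amazon Nova InvokeModel schema expects, to the best of my understanding (the keys below come from the Nova model documentation, not from Continue's code):

```python
import json

def build_nova_body(prompt: str) -> str:
    # Amazon Nova's native InvokeModel schema uses a top-level "messages"
    # list -- exactly the key the Bedrock error reports as missing.
    body = {
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
    }
    return json.dumps(body)
```

Whatever body the bedrockimport provider currently sends for Nova apparently lacks this top-level "messages" key.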