AWS Bedrock support
Hi Jack 👋 ,
Probably a long-shot (given the large API differences), but are there any plans to support AWS Bedrock?
Given Amazon's investment in Anthropic, we've seen an uptick in interest in using Bedrock to run LLMs on existing cloud infrastructure, and we'd like to keep using magentic.
Probably something better handled by LiteLLM?
You could try it via the LitellmChatModel since they seem to support it https://docs.litellm.ai/docs/providers/bedrock
From the docs here https://docs.aws.amazon.com/bedrock/latest/userguide/model-parameters.html , the request format (including messages) and the response format differ between the models hosted on Bedrock. So magentic would need to know how to map to/from each of these, which would probably be easiest as a separate ChatModel for each.
If Anthropic on Bedrock is all you need, you could create a copy of the AnthropicChatModel and just replace the Anthropic client with the AnthropicBedrock client. https://docs.anthropic.com/en/api/claude-on-amazon-bedrock#making-requests
The client is created here: https://github.com/jackmpcollins/magentic/blob/d7ae5c26ff259448a5d1cf9bad3c10340949a37c/src/magentic/chat_model/anthropic_chat_model.py#L315-L318
Other models on Bedrock might similarly be accessible through their providers' SDKs, which could be a nice way to implement this because then those providers would also be supported in magentic directly. This is currently only the case for OpenAI on Azure.