[Feature Request] Amazon Bedrock support?
Hi, would it be possible to add support for Amazon Bedrock LLMs?
@austinmw this PR is still unresolved but might be worth checking out https://github.com/microsoft/autogen/pull/95
Does anyone know if there is interest in a native implementation of Bedrock (instead of using LiteLLM, as proposed in the PR mentioned above)? I was planning on tackling one, as I haven't found anything along those lines. I'm not sure I like the idea of introducing yet another layer of abstraction to my stack.
Implementing a custom client is a good start. See https://microsoft.github.io/autogen/docs/topics/non-openai-models/cloud-anthropic
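For anyone exploring the custom-client route, here is a minimal sketch of the shape a custom model client takes in AutoGen (method names follow the custom-model-client protocol described in the AutoGen non-OpenAI-models docs). The Bedrock call itself is stubbed out so the example runs without AWS credentials; a real implementation would invoke the model via `boto3`'s `bedrock-runtime` client inside `create`.

```python
# Sketch of AutoGen's custom ModelClient protocol applied to Bedrock.
# The actual Bedrock invocation is a stub (an assumption for illustration);
# swap it for a boto3 bedrock-runtime invoke_model call in real use.
from types import SimpleNamespace


class BedrockAnthropicClient:
    """Hypothetical custom client for Bedrock-hosted Anthropic models."""

    def __init__(self, config, **kwargs):
        self.model = config["model"]
        # A real client would build a boto3 "bedrock-runtime" client here
        # using the aws_* fields from the config entry.

    def create(self, params):
        # Stubbed response; a real implementation would send the Anthropic
        # messages payload to Bedrock and parse the returned JSON body.
        text = "stubbed Bedrock response"
        message = SimpleNamespace(content=text, function_call=None)
        return SimpleNamespace(
            choices=[SimpleNamespace(message=message)],
            model=self.model,
            cost=0.0,
        )

    def message_retrieval(self, response):
        # Return the list of generated strings for AutoGen to consume.
        return [choice.message.content for choice in response.choices]

    def cost(self, response):
        return response.cost

    @staticmethod
    def get_usage(response):
        # Token accounting omitted in this sketch.
        return {}
```

With a class like this, you would add `"model_client_cls": "BedrockAnthropicClient"` to the config entry and call `assistant.register_model_client(model_client_cls=BedrockAnthropicClient)` on the agent, per the custom-client docs linked above.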
@sebandres How are you getting on? :)
Hey @jtlz2! I ended up implementing a version of the TextMessage that leverages Bedrock, but after doing so I didn't really agree with some of the patterns used in the SDK, so I opted for a clean-architecture version that does agent choreography instead of orchestration, and ended up not pursuing this any further.
I added support for the Anthropic models available through Bedrock.
You can pull from here until the merge into the main branch is done.
Below is an example usage:
```python
import autogen

config_list = [
    {
        "model": "anthropic.claude-3-5-sonnet-20240620-v1:0",
        "aws_access_key": accessKey,
        "aws_secret_key": secretKey,
        "aws_session_token": sessionTok,
        "aws_region": "us-east-1",
        "api_type": "anthropic",
    }
]

assistant = autogen.AssistantAgent("assistant", llm_config={"config_list": config_list})
```
@makkzone I only got your solution to work after changing line 490 in `autogen/oai/client.py` from:

```python
client = AnthropicClient(**openai_config)
```

to:

```python
client = AnthropicClient(**config)
```

Without that change I ran into the following error: `ValueError: API key or AWS credentials are required to use the Anthropic API.`