Add Bedrock Support
Hi @wesdottoday - we can make this easier. I'm the maintainer of LiteLLM - we let you deploy an LLM proxy that calls 100+ LLMs (Bedrock, OpenAI, Anthropic, etc.) through one OpenAI-compatible format: https://github.com/BerriAI/litellm/tree/main/openai-proxy
If this looks useful, please let me know how we can help.
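For reference, here is a rough sketch of spinning the proxy up locally so the requests below work as-is. The CLI flags and environment variable names are assumptions based on LiteLLM's docs, so check the linked repo for the exact setup.

# Install the proxy and export provider credentials (variable names assumed - verify against LiteLLM docs)
pip install litellm

export AWS_ACCESS_KEY_ID=<your-key-id>          # for the Bedrock model
export AWS_SECRET_ACCESS_KEY=<your-secret-key>
export AWS_REGION_NAME=us-east-1
export OPENAI_API_KEY=<your-openai-key>         # for gpt-3.5-turbo
export ANTHROPIC_API_KEY=<your-anthropic-key>   # for claude-2

# Start the proxy on port 8000
litellm --port 8000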
Usage
Bedrock request
curl http://0.0.0.0:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "bedrock/anthropic.claude-instant-v1",
    "messages": [{"role": "user", "content": "Say this is a test!"}],
    "temperature": 0.7
  }'
gpt-3.5-turbo request
curl http://0.0.0.0:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Say this is a test!"}],
    "temperature": 0.7
  }'
claude-2 request
curl http://0.0.0.0:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-2",
    "messages": [{"role": "user", "content": "Say this is a test!"}],
    "temperature": 0.7
  }'
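Because the proxy returns every backend's response in the OpenAI chat-completions shape, the same extraction works for all three requests above. Here is a small sketch, assuming jq is installed and the response follows the standard choices[0].message.content layout:

# Reuse any of the requests above and pull out just the assistant's reply
curl -s http://0.0.0.0:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-2",
    "messages": [{"role": "user", "content": "Say this is a test!"}],
    "temperature": 0.7
  }' | jq -r '.choices[0].message.content'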