ChatDev
feat: Proxied Model Support
This PR adds support for models served behind OpenAI-compatible proxies, e.g. LiteLLM or the Bedrock Access Gateway (which lacks image generation support). The guiding constraint is to change as little of the original implementation as possible.
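The wiring this requires is small: the `openai` SDK already honors `OPENAI_BASE_URL` and `OPENAI_API_KEY` from the environment, so the main new piece is letting an env var override the model name. A minimal sketch of that pattern (the helper name and default below are illustrative, not the PR's actual code):

```python
import os

# The openai SDK picks up OPENAI_BASE_URL / OPENAI_API_KEY on its own,
# so pointing ChatDev at a proxy only requires overriding the model alias.
def resolve_model(default: str = "gpt-4") -> str:
    """Return the proxied model alias if CHATDEV_CUSTOM_MODEL is set."""
    return os.environ.get("CHATDEV_CUSTOM_MODEL", default)

os.environ["CHATDEV_CUSTOM_MODEL"] = "claude3-sonnet"  # matches the example below
print(resolve_model())  # claude3-sonnet
```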
Example (LiteLLM >> Amazon Bedrock)
- Set up AWS credentials

  ```shell
  export AWS_ACCESS_KEY_ID=...
  export AWS_SECRET_ACCESS_KEY=...
  export AWS_SESSION_TOKEN=...
  ```
- Start the LiteLLM proxy

  ```shell
  litellm --config config.yaml
  ```

  ```yaml
  # config.yaml
  # Adapted from https://litellm.vercel.app/docs/proxy/configs
  model_list:
    - model_name: claude3-sonnet
      litellm_params:
        model: bedrock/anthropic.claude-3-sonnet-20240229-v1:0
        aws_region_name: us-east-1
        temperature: 0.0
    - model_name: sdxl
      litellm_params:
        model: bedrock/stability.stable-diffusion-xl-v1
        aws_region_name: us-east-1
  litellm_settings:
    drop_params: True
    modify_params: True  # hack to get around Claude's Messages API restrictions
  general_settings:
    master_key: correct-horse-battery-staple
  ```
- Set up OpenAI and ChatDev env vars

  ```shell
  # OpenAI
  export OPENAI_BASE_URL=http://0.0.0.0:4000
  export OPENAI_API_KEY=correct-horse-battery-staple

  # ChatDev
  export CHATDEV_CUSTOM_MODEL=claude3-sonnet
  export CHATDEV_NUM_MAX_TOKEN=200000
  export CHATDEV_CUSTOM_IMAGE_MODEL=sdxl
  ```
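  With these env vars in place, every request ChatDev sends is an ordinary OpenAI chat-completions call that LiteLLM routes to Bedrock by model alias. A sketch of what such a request looks like, using only the stdlib (the `"ping"` message is illustrative; the defaults mirror the values above):

  ```python
  import json
  import os

  # Fall back to the values from the step above if the env vars are unset.
  base_url = os.environ.get("OPENAI_BASE_URL", "http://0.0.0.0:4000")
  api_key = os.environ.get("OPENAI_API_KEY", "correct-horse-battery-staple")
  model = os.environ.get("CHATDEV_CUSTOM_MODEL", "claude3-sonnet")

  # LiteLLM exposes the standard OpenAI chat-completions route, so any
  # OpenAI client pointed at base_url works unchanged.
  url = f"{base_url}/v1/chat/completions"
  headers = {
      "Authorization": f"Bearer {api_key}",
      "Content-Type": "application/json",
  }
  payload = {
      "model": model,  # the alias from config.yaml, not the Bedrock model ID
      "messages": [{"role": "user", "content": "ping"}],
  }
  print(url)
  print(json.dumps(payload))
  ```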
- Run ChatDev

  ```shell
  python3 run.py --task "Design a simple game of tic-tac-toe" --name "TicTacToe" --org "THUNLP" --config "Default"
  ```
https://github.com/OpenBMB/ChatDev/assets/7282984/462218bc-4b7e-4dc1-8e0f-6051e7d6d658