Using plandex with litellm pointing to bedrock + claude won't work
Hey there, I am trying to run plandex using litellm with the following config:
```yaml
model_list:
  - model_name: gpt-4o
    litellm_params:
      model: bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0

litellm_settings:
  drop_params: true
```
and pointing Plandex at it via `OPENAI_API_BASE`.

When I run `plandex tell`, LiteLLM returns this: `Error during plan name model call: error, status code: 400, message: litellm.BadRequestError: Invalid Message bedrock requires at least one non-system message`. I wonder: shouldn't the initial prompt always be `user` instead of `system`?
This is the output on Plandex's CLI side:

```
🚨 Server error
→ Error loading plan
→ Error generating plan name
→ Error, status code
→ 400, message
→ Litellm.BadRequestError
→ Invalid Message bedrock requires at least one non-system message
Received Model Group=gpt-3.5-turbo
Available Model Group Fallbacks=None
```
Any update on this? @jaysonsantos did you find any workaround?
@htxryan Are you getting the same error message, or are you looking to use a model via LiteLLM?
It does seem that Anthropic doesn't like getting a request with only a system message, so I'll add a fix for that.
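For anyone hitting this before the fix lands, the shape of the workaround is to check whether the outgoing message list contains only `system` messages and, if so, add a minimal `user` turn before calling the model. This is a hypothetical sketch against an OpenAI-style messages list, not Plandex's actual implementation; the function name and placeholder content are made up:

```python
def ensure_non_system_message(messages):
    """Bedrock's Anthropic models reject requests in which every
    message has role "system". If that's the case, append a minimal
    user message so the request is accepted.

    Hypothetical helper for illustration -- not Plandex's actual fix.
    """
    if messages and all(m.get("role") == "system" for m in messages):
        # Keep the system instructions intact and add a bare user turn.
        return messages + [{"role": "user", "content": "Proceed."}]
    return messages
```

A guard like this would run just before the request is sent, so plans whose first call is purely a system prompt (e.g. the plan-name generation step in the error above) still produce a valid Bedrock request.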
Sorry about the slow reply btw @jaysonsantos!
@danenania Thank you for the quick response and for this amazing project!
We are looking at using Plandex and would like to wire it up via LiteLLM (including using Anthropic models). A fix for this issue would be much appreciated!
@htxryan Sure thing! LiteLLM support is in-progress—it won't be in the very next release, but will be in the one after that, likely in another week or so.
Is there an issue we can track for LiteLLM support? OpenRouter doesn't work for those of us in enterprises where we need private instances, and LiteLLM simplifies this.
@bakes82 Got a little sidetracked, but still working on it. Just tying up loose ends now—should get it released this week.