
Using plandex with litellm pointing to bedrock + claude won't work

jaysonsantos opened this issue 1 year ago · 4 comments

Hey there, I am trying to run plandex using litellm with the following config:

model_list:
  - model_name: gpt-4o
    litellm_params:
      model: bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0
  - model_name: gpt-3.5-turbo 
    litellm_params:
      model: bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0
litellm_settings:
  drop_params: true

and pointing plandex at it via OPENAI_API_BASE. When I run a prompt, LiteLLM returns: Error during plan name model call: error, status code: 400, message: litellm.BadRequestError: Invalid Message bedrock requires at least one non-system message. Shouldn't the initial prompt always be a user message instead of a system message?
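For completeness, this is roughly how I'm wiring the two together (the config filename, port, and key below are placeholders, not my actual values):

```shell
# Start the LiteLLM proxy with the config above (listens on port 4000 by default)
litellm --config litellm-config.yaml

# Point plandex's OpenAI-compatible client at the proxy
export OPENAI_API_BASE=http://localhost:4000
export OPENAI_API_KEY=sk-anything   # placeholder; the proxy holds the real AWS credentials
```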

this is the return on plandex's cli side

🚨 Server error
  → Error loading plan
    → Error generating plan name
      → Error, status code
        → 400, message
          → Litellm.BadRequestError
            → Invalid Message bedrock requires at least one non-system message
Received Model Group=gpt-3.5-turbo
Available Model Group Fallbacks=None

jaysonsantos avatar Nov 13 '24 12:11 jaysonsantos

Any update on this? @jaysonsantos did you find any workaround?

htxryan avatar Apr 22 '25 14:04 htxryan

@htxryan Are you getting the same error message, or are you looking to use a model via LiteLLM?

It does seem that Anthropic doesn't like getting a request with only a system message, so I'll add a fix for that.
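To illustrate the shape of the fix (this is just a sketch, not the actual plandex code — the function name and the fallback text are made up):

```python
def ensure_non_system_message(messages):
    """Bedrock's Anthropic models reject a request whose messages are
    all role "system". If that's the case, append a minimal user turn
    so the request is valid; otherwise pass the messages through."""
    if messages and all(m["role"] == "system" for m in messages):
        return messages + [{"role": "user", "content": "Proceed."}]
    return messages
```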

Sorry about the slow reply btw @jaysonsantos!

danenania avatar Apr 22 '25 14:04 danenania

@danenania Thank you for the quick response and for this amazing project!

We are looking at using Plandex and would like to wire it up via LiteLLM (including using Anthropic models). A fix for this issue would be much appreciated!

htxryan avatar Apr 26 '25 11:04 htxryan

@htxryan Sure thing! LiteLLM support is in-progress—it won't be in the very next release, but will be in the one after that, likely in another week or so.

danenania avatar Apr 26 '25 16:04 danenania

> @htxryan Sure thing! LiteLLM support is in-progress—it won't be in the very next release, but will be in the one after that, likely in another week or so.

Is there an issue we can track for LiteLLM support? OpenRouter doesn't work for those of us in enterprises where we need to use private instances, and LiteLLM simplifies this.

bakes82 avatar May 18 '25 06:05 bakes82

@bakes82 Got a little sidetracked, but still working on it. Just tying up loose ends now—should get it released this week.

danenania avatar May 18 '25 16:05 danenania