[Bug]: litellm fails to process title request from librechat
What happened?

When running litellm, everything appears to work except the title model functionality from LibreChat.
Recorded request from LibreChat (note that it contains only a single system-role message):

```json
{
  "model": "",
  "user": "",
  "temperature": 0.2,
  "presence_penalty": 0,
  "frequency_penalty": 0,
  "max_tokens": 16,
  "messages": [
    {
      "role": "system",
      "content": "Please generate a concise, 5-word-or-less title for the conversation, using its same language, with no punctuation. Apply title case conventions appropriate for the language. Never directly mention the language name or the word \"title\"\n\n||>User:\n\"test title\"\n||>Response:\n\"\"I don't see any text to check beyond \"test title.\" Could you please share the full text you'd like me to look at? I can help evaluate titles or review any other text you provide.\"\"\n\n||>Title:"
    }
  ]
}
```
And the response from litellm:

```
litellm.BadRequestError: Invalid Message bedrock requires at least one non-system message
```
No matter which title model we pick, it's the same result.
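The error is consistent with the recorded request above: the payload contains only a system-role message, and Bedrock requires at least one non-system message. A minimal client-side sketch of the fix (a hypothetical helper, not litellm's actual code; the `modify_params: true` workaround below applies a similar normalization on the proxy side):

```python
def ensure_non_system_message(messages, placeholder="."):
    """Return messages guaranteed to contain at least one non-system entry.

    Bedrock rejects requests whose messages are all role "system", so when
    that is the case we append a minimal placeholder user turn.
    """
    if any(m.get("role") != "system" for m in messages):
        return messages
    return messages + [{"role": "user", "content": placeholder}]


# Example: a title request like the one LibreChat sends (system-only).
title_request = [{"role": "system", "content": "Please generate a concise title..."}]
fixed = ensure_non_system_message(title_request)
# fixed now ends with a placeholder user message, which Bedrock accepts.
```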
Version Information

latest
Steps to Reproduce

1. Route traffic to the litellm proxy
2. Observe the logs
Relevant log output
Are you a ML Ops Team?
No
What LiteLLM version are you on?
v1.65.4-stable
Twitter / LinkedIn details
No response
Posted here, and LibreChat is telling us this is an issue with litellm:
https://github.com/danny-avila/LibreChat/issues/6804
I see the same issue on 1.68-stable and 1.69-stable with LibreChat 0.7.8.
One combination that works is calling the invoke endpoint explicitly, i.e. model: bedrock/invoke/anthropic.claude-3-5-haiku-20241022-v1:0, in combination with setting modify_params to true in config.yaml:

```yaml
litellm_settings:
  modify_params: true
```
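For reference, a fuller config.yaml sketch combining both parts of the workaround (the model_name alias is an assumption; use whatever alias LibreChat is configured to request as its title model):

```yaml
model_list:
  - model_name: title-model   # hypothetical alias; match your LibreChat titleModel setting
    litellm_params:
      model: bedrock/invoke/anthropic.claude-3-5-haiku-20241022-v1:0

litellm_settings:
  modify_params: true
```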