[Bug]: moderation endpoint is broken
What happened?
When using the moderations endpoint via the OpenAI client, LiteLLM seems to send the "openai/" prefix on to the OpenAI API:
const response = await openai.moderations.create({
  input: text,
  // model: "omni-moderation-latest"
  model: "text-moderation-latest"
});
Relevant log output
litellm-1 | INFO: 172.30.0.1:43458 - "POST /v1/moderations HTTP/1.1" 400 Bad Request
litellm-1 | 17:05:41 - LiteLLM Proxy:ERROR: proxy_server.py:5442 - litellm.proxy.proxy_server.moderations(): Exception occured - Error code: 400 - {'error': {'message': "Invalid value for 'model' = openai/text-moderation-latest. Please check the OpenAI documentation and try again.", 'type': 'invalid_request_error', 'param': 'model', 'code': None}
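For comparison, the same call made directly against api.openai.com (bypassing the proxy) only accepts the bare model name. A minimal sketch, assuming OPENAI_API_KEY is set in the environment and the input text is a placeholder:

import OpenAI from "openai";

// Direct call to api.openai.com: the bare model name is accepted,
// while "openai/text-moderation-latest" is rejected with the 400 shown above.
const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment
const moderation = await openai.moderations.create({
  input: "text to classify",
  model: "text-moderation-latest",
});
console.log(moderation.results[0].flagged);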
Can I see how you set this up in your config? It would help to repro.
I'm using the Docker version, and here is my config:
  - model_name: omni-moderation-latest
    litellm_params:
      model: openai/omni-moderation-latest
  - model_name: text-moderation-stable
    litellm_params:
      model: openai/text-moderation-stable
  - model_name: text-moderation-latest
    litellm_params:
      model: openai/text-moderation-latest
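And roughly how the client is pointed at the proxy (the base URL, API key, and input text below are placeholders, not my real values):

import OpenAI from "openai";

// The OpenAI SDK, pointed at the LiteLLM proxy instead of api.openai.com.
const openai = new OpenAI({
  baseURL: "http://localhost:4000",                        // placeholder: wherever the proxy is exposed
  apiKey: process.env.LITELLM_API_KEY ?? "sk-placeholder", // placeholder proxy key
});

// The proxy is expected to map "text-moderation-latest" to openai/text-moderation-latest
// per the config above, and to strip the provider prefix before calling OpenAI.
const response = await openai.moderations.create({
  input: "text to classify",
  model: "text-moderation-latest",
});
console.log(response.results[0].flagged);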
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.