[Bug]: Groq models don't support Pydantic Message models
What happened?
Groq models accept messages passed as plain Python dicts, but not as litellm's Message Pydantic model.
This works fine:
from litellm import completion
messages = [{"content": "Hello, how are you?", "role": "user"}]
completion(model="groq/llama-3.1-8b-instant", messages=messages)
This doesn't work:
from litellm import completion, Message
messages = [Message(content="Hello, how are you?", role="user")]
completion(model="groq/llama-3.1-8b-instant", messages=messages)
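For reference, the same error message can be reproduced outside litellm by handing the Pydantic object straight to json.dumps, which is presumably what the Groq handler does internally (a minimal sketch, assuming Pydantic v2 where model_dump() is available):
import json
from litellm import Message

msg = Message(content="Hello, how are you?", role="user")

# json.dumps cannot serialize a Pydantic model directly; this raises
# "TypeError: Object of type Message is not JSON serializable"
try:
    json.dumps({"messages": [msg]})
except TypeError as exc:
    print(exc)

# Dumping the model to a plain dict first serializes without issue
print(json.dumps({"messages": [msg.model_dump()]}))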
Relevant log output
litellm.exceptions.APIError: litellm.APIError: APIError: GroqException - Object of type Message is not JSON serializable
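As a workaround until this is fixed, converting the Message objects to plain dicts before calling completion avoids the serialization error (a sketch assuming Pydantic v2's model_dump(); on Pydantic v1, .dict() would be the equivalent):
from litellm import completion, Message

messages = [Message(content="Hello, how are you?", role="user")]

# Convert the Pydantic models into the plain dict format Groq accepts;
# exclude_none drops optional fields (e.g. tool_calls) that default to None
plain_messages = [m.model_dump(exclude_none=True) for m in messages]
completion(model="groq/llama-3.1-8b-instant", messages=plain_messages)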
Twitter / LinkedIn details
No response