Error when running Anthropic plugin and models
**Affected plugin(s)**
- [X] Anthropic
- [ ] Azure OpenAI
- [ ] Cohere
- [ ] Convex
- [ ] Groq
- [ ] Mistral
- [ ] OpenAI
**Describe the bug**
When running generation using dotprompt and the Anthropic plugin, I get the error: `roles must alternate between "user" and "assistant", but found multiple "user" roles in a row`. The same code works fine with the OpenAI plugin or Gemini models.
**To Reproduce**
Run generation with Claude models.
**Expected behavior**
Generation should succeed.
Plugin(s) version: 0.10.1
Genkit version: 0.5.13
If I add a `{{role "user"}}` somewhere in the prompt, the error changes to `role system doesn't map to an Anthropic role.`.
It seems it all comes down to the `{{role "system"}}` clause when used in prompt templates: the plugin doesn't translate it correctly for the Claude API.
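For context, Anthropic's Messages API takes the system prompt as a top-level `system` parameter rather than as a message role, so one possible fix is to hoist system messages out of the message array before mapping roles. A minimal sketch (not the plugin's actual code, and assuming plain string content for simplicity):

```javascript
// Hoist any system messages into Anthropic's top-level `system` parameter
// instead of trying to map them to a message role. `splitSystemMessages`
// is a hypothetical helper, not part of the plugin.
function splitSystemMessages(messages) {
  const systemParts = [];
  const rest = [];
  for (const m of messages) {
    if (m.role === 'system') {
      systemParts.push(m.content);
    } else {
      rest.push(m);
    }
  }
  return {
    system: systemParts.length > 0 ? systemParts.join('\n') : undefined,
    messages: rest,
  };
}
```

The remaining `messages` array then only ever contains roles that have a direct Anthropic equivalent.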
Hey @nechmads can you share an example prompt and/or code. I think I see what you are likely running into - but want to verify. Vertex AI also supports Anthropic, so I'd like to verify it in both places if possible.
Here is the code from each of the plugins; Anthropic doesn't support `system` at all. Perhaps we should just map it to `"assistant"`.
Strangely, OpenAI doesn't seem to support the `system` role in some of their models either. If you pass a system role within `messages` (as the first message, say), there appears to be some "magic" that happens. Perhaps we should just convert it to a role the target API understands and warn the caller, rather than trying to enforce restrictions, since that makes coding across models complicated.
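The "coerce and warn" approach suggested above could look roughly like this for the Anthropic mapper; `toAnthropicRoleLenient` is a hypothetical variant, not the plugin's current code, and the fallback to `"user"` is one assumed choice:

```javascript
// Lenient variant of the role mapper: instead of throwing on an unknown
// role, fall back to a role the Anthropic API understands and warn.
function toAnthropicRoleLenient(role, toolMessageType) {
  switch (role) {
    case 'user':
      return 'user';
    case 'model':
      return 'assistant';
    case 'tool':
      return toolMessageType === 'tool_use' ? 'assistant' : 'user';
    default:
      console.warn(
        `role ${role} doesn't map to an Anthropic role; treating it as "user"`
      );
      return 'user';
  }
}
```

The trade-off is that silently coercing roles can mask prompt-authoring mistakes, which is presumably why the current mappers throw instead.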
```js
function toAnthropicRole(role, toolMessageType) {
  switch (role) {
    case 'user':
      return 'user';
    case 'model':
      return 'assistant';
    case 'tool':
      return toolMessageType === 'tool_use' ? 'assistant' : 'user';
    default:
      throw new Error(`role ${role} doesn't map to an Anthropic role.`);
  }
}
```

```js
function toGroqRole(role) {
  switch (role) {
    case 'user':
      return 'user';
    case 'model':
    case 'tool':
      return 'assistant';
    case 'system':
      return 'system';
    default:
      throw new Error(`role ${role} doesn't map to a Groq role.`);
  }
}
```

```js
function toMistralRole(role) {
  switch (role) {
    case 'user':
      return 'user';
    case 'model':
      return 'assistant';
    case 'system':
      return 'system';
    case 'tool':
      return 'assistant';
    default:
      throw new Error(`Role ${role} doesn't map to a Mistral role.`);
  }
}
```

```js
function toOpenAIRole(role) {
  switch (role) {
    case 'user':
      return 'user';
    case 'model':
      return 'assistant';
    case 'system':
      return 'system';
    case 'tool':
      return 'tool';
    default:
      throw new Error(`role ${role} doesn't map to an OpenAI role.`);
  }
}
```

```js
function toCohereRole(role) {
  switch (role) {
    case 'user':
      return 'USER';
    case 'model':
      return 'CHATBOT';
    case 'system':
      return 'SYSTEM';
    case 'tool':
      return 'CHATBOT';
    default:
      throw new Error(`role ${role} doesn't map to a Cohere role.`);
  }
}
```
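Separately from the `system` mapping, the original alternation error suggests the plugin may also need to merge adjacent messages after mapping: since both `tool` (with a `tool_result`) and `user` map to `"user"`, two `"user"` messages can end up adjacent, which Anthropic rejects. A sketch of a post-processing step (an assumed helper, not from the plugin, again treating content as plain strings):

```javascript
// After role mapping, merge consecutive messages that share a role so the
// result satisfies Anthropic's user/assistant alternation requirement.
function mergeConsecutiveRoles(messages) {
  const merged = [];
  for (const m of messages) {
    const last = merged[merged.length - 1];
    if (last && last.role === m.role) {
      last.content += '\n' + m.content; // fold into the previous message
    } else {
      merged.push({ ...m }); // copy so the input array isn't mutated
    }
  }
  return merged;
}
```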
```ts
function toGeminiRole(
  role: MessageData['role'],
  model?: ModelReference<z.ZodTypeAny>
): string {
  switch (role) {
    case 'user':
      return 'user';
    case 'model':
      return 'model';
    case 'system':
      if (model && SUPPORTED_V15_MODELS[model.name]) {
        // We should have already pulled out the supported system messages,
        // anything remaining is unsupported; throw an error.
        throw new Error(
          'system role is only supported for a single message in the first position'
        );
      } else {
        throw new Error('system role is not supported');
      }
    case 'tool':
      return 'function';
    default:
      return 'user';
  }
}
```
Where:

```ts
export const SUPPORTED_V15_MODELS = {
  'gemini-1.5-pro': gemini15Pro,
  'gemini-1.5-flash': gemini15Flash,
  'gemini-1.5-flash-8b': gemini15Flash8b,
  'gemini-2.0-flash': gemini20Flash,
  'gemini-2.0-pro-exp-02-05': gemini20ProExp0205,
};
```
This should be fixed in the latest version.