ml-commons
[BUG] Unexpected schema validation error comes from model interface
What is the bug? When predicting with an ML model, even when the connector payload is fully filled, I still receive the following error from the model interface.
{
"error": {
"root_cause": [
{
"type": "status_exception",
"reason": "Error validating input schema: Validation failed: [$.parameters: required property 'inputs' not found] for instance: {\"algorithm\":\"REMOTE\",\"parameters\":{\"prompt\":\"hello, who are you?\"},\"action_type\":\"PREDICT\"} with schema: {\n \"type\": \"object\",\n \"properties\": {\n \"parameters\": {\n \"type\": \"object\",\n \"properties\": {\n \"inputs\": {\n \"type\": \"string\"\n }\n },\n \"required\": [\n \"inputs\"\n ]\n }\n },\n \"required\": [\n \"parameters\"\n ]\n}"
}
],
"type": "status_exception",
"reason": "Error validating input schema: Validation failed: [$.parameters: required property 'inputs' not found] for instance: {\"algorithm\":\"REMOTE\",\"parameters\":{\"prompt\":\"hello, who are you?\"},\"action_type\":\"PREDICT\"} with schema: {\n \"type\": \"object\",\n \"properties\": {\n \"parameters\": {\n \"type\": \"object\",\n \"properties\": {\n \"inputs\": {\n \"type\": \"string\"\n }\n },\n \"required\": [\n \"inputs\"\n ]\n }\n },\n \"required\": [\n \"parameters\"\n ]\n}"
},
"status": 400
}
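The schema in the error message requires `parameters.inputs`, so a request that only supplies `parameters.prompt` is rejected before the connector's `${parameters.prompt}` substitution ever runs. A minimal pure-stdlib sketch of that check (ml-commons uses its own Java validator; this only mirrors the behavior described in the error):

```python
def validate_interface(instance: dict) -> list:
    """Mimic the interface schema from the error message:
    {"required": ["parameters"], "parameters": {"required": ["inputs"]}}.
    Returns the list of validation errors (empty when valid)."""
    errors = []
    params = instance.get("parameters")
    if params is None:
        errors.append("$: required property 'parameters' not found")
    elif "inputs" not in params:
        errors.append("$.parameters: required property 'inputs' not found")
    return errors

# The failing request from step 4: only `prompt` is supplied.
print(validate_interface({"parameters": {"prompt": "hello, who are you?"}}))
# -> ["$.parameters: required property 'inputs' not found"]
```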
How can one reproduce the bug?
- Create a connector for the Claude v2 model from the connector blueprint. Only change `${parameters.inputs}` to `${parameters.prompt}`.
POST /_plugins/_ml/connectors/_create
{
"name": "Amazon Bedrock",
"description": "Test connector for Amazon Bedrock",
"version": 1,
"protocol": "aws_sigv4",
"credential": {
"access_key": "...",
"secret_key": "..."
},
"parameters": {
"region": "...",
"service_name": "bedrock",
"model": "anthropic.claude-v2"
},
"actions": [
{
"action_type": "predict",
"method": "POST",
"headers": {
"content-type": "application/json"
},
"url": "https://bedrock-runtime.${parameters.region}.amazonaws.com/model/${parameters.model}/invoke",
"request_body": """{"prompt":"\n\nHuman: ${parameters.prompt}\n\nAssistant:","max_tokens_to_sample":300,"temperature":0.5,"top_k":250,"top_p":1,"stop_sequences":["\\n\\nHuman:"]}"""
}
]
}
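On the connector side, the `${parameters.*}` placeholders in `request_body` are filled from the request's `parameters` map, so the edited connector needs `prompt` rather than `inputs`. A hedged sketch of that substitution (names are illustrative, not ml-commons internals):

```python
import re

def fill_payload(template: str, parameters: dict) -> str:
    """Replace each ${parameters.NAME} token in the template with the
    matching value; raise if a referenced parameter is missing."""
    def substitute(match):
        key = match.group(1)
        if key not in parameters:
            raise KeyError("missing connector parameter: " + key)
        return str(parameters[key])
    return re.sub(r"\$\{parameters\.(\w+)\}", substitute, template)

# The edited connector body references `prompt`, so `inputs` alone
# cannot fill the payload.
template = "Human: ${parameters.prompt} Assistant:"
print(fill_payload(template, {"prompt": "hello, who are you?"}))
# -> Human: hello, who are you? Assistant:
```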
- Register a model with the connector
POST /_plugins/_ml/models/_register
{
"name": "anthropic.claude-v2",
"function_name": "remote",
"description": "test model",
"connector_id": "6uQTs5IB3hywALe5wWxu"
}
- Run a model prediction with only the `inputs` parameter. You'll receive a connector payload validation error, since the `prompt` parameter is required. This is expected.
POST /_plugins/_ml/models/IY9qt5IB1dQFBoCTo8nv/_predict/
{
"parameters": {
"inputs": "hello, who are you?"
}
}
- Run a model prediction with only the `prompt` parameter. You'll receive the model interface error shown above.
POST /_plugins/_ml/models/IY9qt5IB1dQFBoCTo8nv/_predict/
{
"parameters": {
"prompt": "hello, who are you?"
}
}
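In other words, the auto-generated interface still requires `inputs` while the edited connector body requires `prompt`, so neither request in steps 3 and 4 can satisfy both checks. An illustrative summary (assumed check ordering; function names are hypothetical):

```python
interface_required = {"inputs"}   # from the schema in the error message
connector_required = {"prompt"}   # from ${parameters.prompt} in request_body

def which_check_fails(params: dict):
    """Report which validation step rejects the given parameters map,
    assuming the interface is checked before the connector payload."""
    if not interface_required <= params.keys():
        return "model interface"      # step 4's error
    if not connector_required <= params.keys():
        return "connector payload"    # step 3's error
    return None  # only a request carrying BOTH parameters would pass

print(which_check_fails({"inputs": "hello"}))  # -> connector payload
print(which_check_fails({"prompt": "hello"}))  # -> model interface
```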
What is the expected behavior?
I should receive the model response as long as the connector payload is filled. Oddly, the bug does not reproduce if step 3 is skipped.
What is your host/environment?
- OS: macOS
- Version: main
- Plugins: ml-commons