
[BUG] Unexpected schema validation error comes from model interface

Open · yuye-aws opened this issue on Oct 23, 2024 · 26 comments

What is the bug? When predicting with an ML model, even though every parameter in the connector payload is filled, I still receive the following error from the model interface.

{
  "error": {
    "root_cause": [
      {
        "type": "status_exception",
        "reason": "Error validating input schema: Validation failed: [$.parameters: required property 'inputs' not found] for instance: {\"algorithm\":\"REMOTE\",\"parameters\":{\"prompt\":\"hello, who are you?\"},\"action_type\":\"PREDICT\"} with schema: {\n    \"type\": \"object\",\n    \"properties\": {\n        \"parameters\": {\n            \"type\": \"object\",\n            \"properties\": {\n                \"inputs\": {\n                    \"type\": \"string\"\n                }\n            },\n            \"required\": [\n                \"inputs\"\n            ]\n        }\n    },\n    \"required\": [\n        \"parameters\"\n    ]\n}"
      }
    ],
    "type": "status_exception",
    "reason": "Error validating input schema: Validation failed: [$.parameters: required property 'inputs' not found] for instance: {\"algorithm\":\"REMOTE\",\"parameters\":{\"prompt\":\"hello, who are you?\"},\"action_type\":\"PREDICT\"} with schema: {\n    \"type\": \"object\",\n    \"properties\": {\n        \"parameters\": {\n            \"type\": \"object\",\n            \"properties\": {\n                \"inputs\": {\n                    \"type\": \"string\"\n                }\n            },\n            \"required\": [\n                \"inputs\"\n            ]\n        }\n    },\n    \"required\": [\n        \"parameters\"\n    ]\n}"
  },
  "status": 400
}
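
For readability, this is the input schema embedded in the error message, pretty-printed. Note that it requires parameters.inputs, even though the connector template below only references parameters.prompt:

{
  "type": "object",
  "properties": {
    "parameters": {
      "type": "object",
      "properties": {
        "inputs": {
          "type": "string"
        }
      },
      "required": ["inputs"]
    }
  },
  "required": ["parameters"]
}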

How can one reproduce the bug?

  1. Create a connector for the Claude v2 model from the connector blueprint, changing only ${parameters.inputs} to ${parameters.prompt}.
POST /_plugins/_ml/connectors/_create
{
  "name": "Amazon Bedrock",
  "description": "Test connector for Amazon Bedrock",
  "version": 1,
  "protocol": "aws_sigv4",
  "credential": {
    "access_key": "...",
    "secret_key": "..."
  },
  "parameters": {
    "region": "...",
    "service_name": "bedrock",
    "model": "anthropic.claude-v2"
  },
  "actions": [
    {
      "action_type": "predict",
      "method": "POST",
      "headers": {
        "content-type": "application/json"
      },
      "url": "https://bedrock-runtime.${parameters.region}.amazonaws.com/model/${parameters.model}/invoke",
      "request_body": """{"prompt":"\n\nHuman: ${parameters.prompt}\n\nAssistant:","max_tokens_to_sample":300,"temperature":0.5,"top_k":250,"top_p":1,"stop_sequences":["\\n\\nHuman:"]}"""
    }
  ]
}
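For reference, after parameter substitution, the body this connector sends to Bedrock for the prompt "hello, who are you?" should look roughly like the following (my reconstruction from the template above, not captured output):

{
  "prompt": "\n\nHuman: hello, who are you?\n\nAssistant:",
  "max_tokens_to_sample": 300,
  "temperature": 0.5,
  "top_k": 250,
  "top_p": 1,
  "stop_sequences": ["\\n\\nHuman:"]
}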
  2. Register a model with the connector created in step 1.
POST /_plugins/_ml/models/_register
{
  "name": "anthropic.claude-v2",
  "function_name": "remote",
  "description": "test model",
  "connector_id": "6uQTs5IB3hywALe5wWxu"
}
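If registration succeeds, the response should contain the new model ID (an illustrative sketch; the task_id value is a placeholder, and the model_id shown is the one used in the prediction calls below):

{
  "task_id": "...",
  "status": "CREATED",
  "model_id": "IY9qt5IB1dQFBoCTo8nv"
}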
  3. Run a model prediction with only the inputs parameter. You'll receive a connector payload validation error, since the prompt parameter is required. This is the correct behavior.
POST /_plugins/_ml/models/IY9qt5IB1dQFBoCTo8nv/_predict/
{
  "parameters": {
    "inputs": "hello, who are you?"
  }
} 
  4. Run a model prediction with only the prompt parameter. You'll receive the model interface error shown above.
POST /_plugins/_ml/models/IY9qt5IB1dQFBoCTo8nv/_predict/
{
  "parameters": {
    "prompt": "hello, who are you?"
  }
}
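
To check where the unexpected schema comes from, it may help to inspect the registered model; if ml-commons attached a default interface during registration, it should appear in the interface field of the response (a diagnostic suggestion, not something I have verified on this model):

GET /_plugins/_ml/models/IY9qt5IB1dQFBoCTo8nv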

What is the expected behavior?

I should receive the model response as long as the connector payload is filled. I also find it quite strange that the bug cannot be reproduced if I skip step 3.
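
As a possible workaround (untested, and it assumes the interface field of the register API accepts a custom JSON Schema string for the input), registering the model with an input schema that matches the connector's prompt parameter might avoid the mismatched default interface:

POST /_plugins/_ml/models/_register
{
  "name": "anthropic.claude-v2",
  "function_name": "remote",
  "description": "test model",
  "connector_id": "6uQTs5IB3hywALe5wWxu",
  "interface": {
    "input": "{\"type\":\"object\",\"properties\":{\"parameters\":{\"type\":\"object\",\"properties\":{\"prompt\":{\"type\":\"string\"}},\"required\":[\"prompt\"]}},\"required\":[\"parameters\"]}"
  }
}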

What is your host/environment?

  • OS: macOS
  • Version: main
  • Plugins: ml-commons

Do you have any screenshots?

Do you have any additional context?

yuye-aws · Oct 23 '24 03:10