ml-commons
[BUG] Invalid JSON in payload error despite sending a valid JSON with all required parameters
What is the bug? Sending a predict request to a model that uses a SageMaker connector, like so:
POST /_plugins/_ml/models/_6gdD40BZqSAbrEiV6DT/_predict
{
"parameters": {
"inputs": "test sentence"
}
}
produces this error:
{
"error": {
"root_cause": [
{
"type": "illegal_argument_exception",
"reason": "Invalid JSON in payload"
}
],
"type": "illegal_argument_exception",
"reason": "Invalid JSON in payload"
},
"status": 400
}
despite the request body being complete, valid JSON.
How can one reproduce the bug? Steps to reproduce the behavior:
- Set up a connector to a SageMaker endpoint (any endpoint is okay, since the error occurs in the connector itself)
- Deploy the model and note the model ID
- Predict
curl -XPOST "http://localhost:9200/_plugins/_ml/models/<Model ID>/_predict" -H 'Content-Type: application/json' -d' { "parameters": { "inputs": "test sentence" } }'
- See error
What is the expected behavior?
The request should trigger processing of the request_body in the connector without any errors.
What is your host/environment?
- AWS OpenSearch Service 2.11
- ML Commons
Do you have any additional context?
Passing "inputs": ["test sentence"]
works. However, I need the embedding of just the sentence, without the extra square brackets. Moreover, ${parameters.input}[0]
works without any errors but gave a different embedding for a test sentence than expected.
Hi @NeuralFlux, this works well for me. There may be a problem with the request_body used when creating the connector.
It should be
"request_body": "{ \"inputs\": \"${parameters.inputs}\" }"
I realized JSON has different standards for what counts as "valid". May I know which standard is used to check the payload? Plain strings are valid JSON documents according to RFC 7159 and RFC 8259.
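For reference, a quick check with Python's json module (which follows RFC 8259) shows a bare quoted string parsing as a valid top-level document:

```python
import json

# RFC 8259 allows any JSON value, including a string, at the top level.
assert json.loads('"test sentence"') == "test sentence"

# Under the older RFC 4627, only an object or array was allowed at the
# top level, so a validator written to that rule would reject the above.
# (An *unquoted* string like `test sentence` is invalid under every RFC.)
```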
Hi @NeuralFlux, could you please share your connector configuration?
Sure thing, it's
"actions": [
{
"action_type": "predict",
"method": "POST",
"headers": {
"content-type": "application/x-text"
},
"url": "<INFERENCE_ENDPOINT>",
"request_body": "${parameters.inputs}"
}
]
This might be resolved using the model interface feature: https://github.com/opensearch-project/ml-commons/issues/2354