tfdeploy::serve_savedmodel predictions work locally, but on an RStudio Connect server they return {"error":"Error while attempting to parse predict request: json: cannot unmarshal number into Go value of type []float32"}
Hello, I'm not sure how relevant this is to tfdeploy, but I just wanted to point it out.
I've trained my own NN and successfully tested local deployment with the following json:
{
  "instances": [
    { "input_text": [0, 0, 0, 0, 0, 0, 0, 163, 3503, 82], "input_other": [-0.0059, 0, 0] },
    { "input_text": [0, 0, 0, 0, 0, 0, 0, 163, 3503, 82], "input_other": [-0.0059, 0, 0] }
  ]
}
The model is subsequently deployed to the latest version of RStudio Connect, where it runs without problems. However, when I try to make a prediction on the exact same input JSON, I receive the following message:
400; {"error":"Error while attempting to parse predict request: json: cannot unmarshal number into Go value of type []float32"}
Do I need to perform any additional steps before making that prediction on RSC?
Yes, you probably need to wrap the entries in a list(), since predict_savedmodel() expects a list of instances to predict.
For example, first train a model as described in the Introduction to tfdeploy article, then deploy it to RStudio Connect as follows:
rsconnect::deployTFModel("savedmodel")
After deploying to RStudio Connect, the predictions can be performed as follows:
tfdeploy::predict_savedmodel(
  list(list(rep(0.0, 784))),
  "https://path-to-rstudio-connect/content/2058/predict"
)
Prediction 1:
[1] 0.03622832
Prediction 2:
[1] 0.05154019
Prediction 3:
[1] 0.06207469
Prediction 4:
[1] 0.1104403
Prediction 5:
[1] 0.03014009
Prediction 6:
[1] 0.4445479
Prediction 7:
[1] 0.03220431
Prediction 8:
[1] 0.08737313
Prediction 9:
[1] 0.06066281
Prediction 10:
[1] 0.08478825
Does this help?
My call was a list from the very beginning, assembled as follows from one particular example that I used when training the model:
pred_example_formatted <- list(
  instances = list(
    list(
      input_text = c(pred_example[[1]]),
      input_other = c(pred_example[[2]])
    ),
    list(
      input_text = c(pred_example[[1]]),
      input_other = c(pred_example[[2]])
    )
  )
)
When I use this input with tfdeploy::serve_savedmodel("my_model", browse = TRUE) to test locally, everything works. The problem persists after the model is deployed to RSC: using the same formatted input to make a prediction against the server with curl fails.
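For what it's worth, a common cause of that exact Go error ("cannot unmarshal number into Go value of type []float32") is scalar-versus-array serialization: jsonlite collapses length-one vectors to bare JSON numbers when auto_unbox = TRUE, and a strict parser then rejects a number where it expects an array. A minimal sketch (jsonlite assumed; the field name is illustrative):

```r
library(jsonlite)

payload <- list(instances = list(list(input_other = c(-0.0059))))

# With auto_unbox = TRUE the length-1 vector serializes as a bare
# number, which a strict []float32 parser rejects:
toJSON(payload, auto_unbox = TRUE)

# With the default (auto_unbox = FALSE) it stays a JSON array,
# matching the shape the predict endpoint expects:
toJSON(payload)
```

Comparing the body that works locally with what is actually sent to the server (e.g. via httr::with_verbose()) can confirm whether this mismatch is the culprit.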
I already read that article, but it doesn't specify how I should make a prediction with:
tfdeploy::predict_savedmodel(
  list(list(rep(0.0, 784))),
  "https://path-to-rstudio-connect/content/2058/predict"
)
when I need to pass additional authentication parameters. When I use the plain form, it says the content can't be found. I tried the following, but none of it helped (the URL is definitely correct):
tfdeploy::predict_savedmodel(
  pred_example_formatted,
  "my_model_url",
  type = "webapi",
  httr::add_headers(Authorization = "xxx"),
  httr::accept_json(),
  httr::content_type("application/json")
)
@javierluraschi another question related to tfdeploy::predict_savedmodel after I do tfdeploy::serve_savedmodel. So my input has the following structure:
df_seq <- list(
  instances = list(
    input_text = input_text[1:5],
    input_other = input_other[1:5]
  )
)
which looks like this:
$instances
$instances$input_text
$instances$input_text[[1]]
[1] 0 0 2136 137 63 133 96 607 734 400
$instances$input_text[[2]]
[1] 526 527 6 7 836 44 57 324 837 54
$instances$input_text[[3]]
[1] 3 691 8 21 3335 40 473 578 335 74
$instances$input_text[[4]]
[1] 0 0 0 0 0 1341 1523 75 42 74
$instances$input_text[[5]]
[1] 0 0 0 0 192 2387 82 7 264 12
$instances$input_other
$instances$input_other[[1]]
[1] 0.001158364 1.000000000 0.000000000 0.000000000 0.000000000 0.892956149 0.902510499 0.697446330 0.000000000 5.513473994 0.544115622
[12] -0.208037276 -0.732371689 0.952305391 0.847779321 0.000000000 -0.806740932 0.160529040 -0.663567084 0.063788885 -0.107492585 -0.027918154
[23] 0.000000000 0.000000000 0.000000000 -0.022229750 -0.082435554
$instances$input_other[[2]]
[1] -0.04306745 0.00000000 0.00000000 0.00000000 0.00000000 1.46732274 0.90251050 1.42793931 0.00000000 0.60700686 0.95879782 -0.08902028
[13] 0.48611218 0.79584350 0.62159309 0.00000000 1.96582131 -0.95761424 0.42473171 0.06378888 -0.10749258 -0.02791815 0.00000000 0.00000000
[25] 0.00000000 -0.02222975 -0.08243555
$instances$input_other[[3]]
[1] -0.03576120 0.00000000 0.00000000 0.00000000 0.00000000 1.82964008 0.90251050 1.82069381 0.00000000 0.60700686 1.27343890 0.04690398
[13] 0.48611218 0.99952289 0.49326439 0.00000000 2.27990479 -1.46696809 0.52993286 0.06378888 -0.10749258 -0.02791815 0.00000000 0.00000000
[25] 0.00000000 -0.02222975 -0.08243555
$instances$input_other[[4]]
[1] -0.02584746 0.00000000 0.00000000 0.00000000 0.00000000 0.19228020 0.90251050 -2.92155927 0.00000000 0.60700686 0.91611717 0.99914429
[13] 0.48611218 0.99952289 0.27581282 0.00000000 0.38733663 0.16904957 -1.92046757 0.06378888 -0.10749258 -0.02791815 0.00000000 0.00000000
[25] 0.00000000 -0.02222975 -0.08243555
$instances$input_other[[5]]
[1] 0.04692546 1.00000000 0.00000000 0.00000000 0.00000000 0.33849751 0.90251050 0.38287096 0.00000000 0.60700686 -0.86919448 -1.02507085
[13] 1.19887954 0.73782810 0.35208237 0.00000000 -0.80674093 0.10360053 -0.95264358 0.06378888 -0.10749258 -0.02791815 0.00000000 0.00000000
[25] 0.00000000 -0.02222975 -0.08243555
As you can see, there are 5 different instances there for which I need predictions. However, when I call the following function:
(prediction <- tfdeploy::predict_savedmodel(
  df_seq,
  "http://127.0.0.1:8089/serving_default/predict/"
))
I only ever get the first one:
> prediction
$dense_3
[1] 1.0000e+00 3.6489e-24 9.2066e-16 5.2888e-17 1.5738e-18 1.5753e-17 2.6910e-23 1.2752e-23 1.1595e-21 2.9676e-29 2.4302e-24 4.2637e-18 7.2284e-24
[14] 1.3259e-22 8.4549e-18
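One thing worth checking in the df_seq above: instances holds two parallel named fields (input_text, input_other) rather than one element per instance, so the server may see a single instance instead of five. A sketch of reshaping into one record per instance with base R's Map() (variable names taken from the snippet above):

```r
# Pair the i-th text input with the i-th other input so that each
# element of `instances` is one complete instance carrying both fields:
df_seq <- list(
  instances = Map(
    function(text, other) list(input_text = text, input_other = other),
    input_text[1:5],
    input_other[1:5]
  )
)
# Now length(df_seq$instances) is 5, and each element has both fields.
```

This mirrors the shape of the JSON that worked locally earlier in the thread, where instances is an array of objects, each with its own input_text and input_other.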