
ONNX Backend Fails to Initialize with String Input

Open dherms opened this issue 3 years ago • 0 comments

Description

When attempting to launch a model converted to ONNX with convert_sklearn, the model fails to load with this error:

 UNAVAILABLE: Internal: onnx runtime error 6: Exception during initialization: /workspace/onnxruntime/onnxruntime/core/providers/cpu/nn/string_normalizer.cc:89 onnxruntime::string_normalizer::Locale::Locale(const string&)::<lambda()> Failed to construct locale with name:en_US.UTF-8:locale::facet::_S_create_c_locale name not valid:Please, install necessary language-pack-XX and configure locales

Triton Information

What version of Triton are you using? 22.09

Are you using the Triton container or did you build it yourself?

Triton Container

To Reproduce

Run the Docker container as suggested in the Triton Server Getting Started page:

docker run --env-file <env_file> -p8000:8000 -p8001:8001 -p8002:8002 --rm --net=host nvcr.io/nvidia/tritonserver:22.09-py3 tritonserver --model-repository=s3://<model_repository>/models

The model is a scikit-learn SGD model (the one developed in this blog post). It takes in an array of strings and returns an array of integers.
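
For context, here is a minimal sketch of how such a model might be exported. The TfidfVectorizer/SGDClassifier pipeline, training data, and input name are assumptions for illustration, not the exact code from the blog post:

# Hypothetical export sketch; the pipeline and names are placeholders,
# not the exact code from the blog post.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import SGDClassifier
from sklearn.pipeline import Pipeline
from skl2onnx import convert_sklearn
from skl2onnx.common.data_types import StringTensorType

pipeline = Pipeline([
    ("tfidf", TfidfVectorizer()),
    ("sgd", SGDClassifier()),
])
pipeline.fit(["example text a", "example text b"], [0, 1])

# Declare a 1-D string tensor input so the exported graph accepts an
# array of strings, matching the TYPE_STRING input in config.pbtxt below.
# The exported vectorizer includes a StringNormalizer node, which appears
# to be where the locale error above is raised at load time.
onnx_model = convert_sklearn(
    pipeline,
    initial_types=[("string_input", StringTensorType([None]))],
)
with open("model.onnx", "wb") as f:
    f.write(onnx_model.SerializeToString())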

I'm not sure that my config is totally correct, but I don't think this is the source of the issue.

config.pbtxt:

name: "sgd"
platform: "onnxruntime_onnx"

input [
  {
    name: "string_input"
    data_type: TYPE_STRING
    dims: [-1]
  }
]

output [
  {
    name: "output_label"
    data_type: TYPE_INT8
    dims: [-1, 1]
  }
]
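
For completeness, a hypothetical client call against this config might look like the following (assuming tritonclient is installed and the HTTP endpoint is reachable at localhost:8000; TYPE_STRING inputs are sent as BYTES tensors):

# Hypothetical inference request matching the config above.
import numpy as np
import tritonclient.http as httpclient

client = httpclient.InferenceServerClient(url="localhost:8000")

# TYPE_STRING maps to the BYTES datatype on the client side.
texts = np.array(["an example sentence"], dtype=object)
infer_input = httpclient.InferInput("string_input", list(texts.shape), "BYTES")
infer_input.set_data_from_numpy(texts)

result = client.infer(model_name="sgd", inputs=[infer_input])
print(result.as_numpy("output_label"))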

Expected behavior

Expect the model to load successfully and the server to start.

dherms · Oct 29 '22 07:10