model_analyzer
Model load failed: [StatusCode.INTERNAL]
Hello, I am trying to run model_analyzer on an XGBoost model I have. I am able to start the Docker SDK client container and submit a model-analyzer command, but I keep getting the following error for all of the auto-generated models.
[Model Analyzer] WARNING: Overriding the output model repo path "/home/dvanstee/data/projects/2022-07-tritonDemo/fil_demo/model_repository/ma_out1"
[Model Analyzer] Starting a Triton Server using docker
[Model Analyzer] Loaded checkpoint from file /tmp/ckpts/1.ckpt
[Model Analyzer] Profiling server only metrics...
[Model Analyzer] Stopped Triton Server.
[Model Analyzer]
[Model Analyzer] Creating model config: xgboost_classifier_config_default
[Model Analyzer]
[Model Analyzer] Model xgboost_classifier_config_default load failed: [StatusCode.INTERNAL] failed to load 'xgboost_classifier_config_default', failed to poll from model repository
Here is how I start Docker:
docker run -it --rm --net=host \
-v /var/run/docker.sock:/var/run/docker.sock \
-v ${BASE}/model_repository:/models \
-v ${BASE}/model_repository:/${BASE}/model_repository \
-v ${BASE}:/notebooks/fil_demo/ \
nvcr.io/nvidia/tritonserver:22.06-py3-sdk
Here is how I run model_analyzer:
model-analyzer profile \
--checkpoint-directory /tmp/ckpts \
--model-repository /model_repository \
--triton-launch-mode=docker \
--triton-docker-image nvcr.io/nvidia/tritonserver:22.06-py3 \
--triton-docker-mounts /model_repository:/model_repository:rw \
--profile-models xgboost_classifier \
--override-output-model-repository \
--output-model-repository-path /model_repository/ma_out1
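While writing this up, I noticed that my docker run only mounts the repository at /models and at /${BASE}/model_repository inside the SDK container, while the profile command points --model-repository at /model_repository. In case that mismatch matters, here is an untested variant of the container launch with the mount aligned to the flag (paths taken from my commands above):

```shell
# Untested guess: mount the host repository at the same in-container path
# that --model-repository and --triton-docker-mounts refer to.
docker run -it --rm --net=host \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -v ${BASE}/model_repository:/model_repository \
    -v ${BASE}:/notebooks/fil_demo/ \
    nvcr.io/nvidia/tritonserver:22.06-py3-sdk
```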
Here is what I see in the output directory:
/model_repository/ma_out1$ ll
total 36
drwxr-xr-x 9 root root 4096 Aug 1 12:35 .
drwxrwxrwx 6 dvanstee dvanstee 4096 Aug 1 12:35 ..
drwxr-xr-x 2 root root 4096 Aug 1 12:35 xgboost_classifier_config_0
drwxr-xr-x 2 root root 4096 Aug 1 12:35 xgboost_classifier_config_1
drwxr-xr-x 2 root root 4096 Aug 1 12:35 xgboost_classifier_config_2
drwxr-xr-x 2 root root 4096 Aug 1 12:35 xgboost_classifier_config_3
drwxr-xr-x 2 root root 4096 Aug 1 12:35 xgboost_classifier_config_4
drwxr-xr-x 2 root root 4096 Aug 1 12:35 xgboost_classifier_config_5
drwxr-xr-x 3 root root 4096 Aug 1 12:35 xgboost_classifier_config_default
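Since the error says "failed to poll from model repository", I also put together a quick check (my own sketch, not something from the model_analyzer docs) that flags any model directory missing the layout Triton polls for, i.e. a config.pbtxt plus at least one numeric version subdirectory:

```shell
# check_repo: hypothetical helper; prints a line for every model directory
# under $1 that lacks a config.pbtxt or a numeric version subdirectory.
check_repo() {
  for d in "$1"/*/; do
    [ -d "$d" ] || continue                    # skip if the glob matched nothing
    name=$(basename "$d")
    [ -f "${d}config.pbtxt" ] || echo "$name: missing config.pbtxt"
    ls -d "${d}"[0-9]* >/dev/null 2>&1 || echo "$name: no version subdirectory"
  done
}
# e.g. check_repo /model_repository/ma_out1
```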
I am able to run inference and also use perf_analyzer, but I can't seem to figure out the right syntax for model_analyzer.
I was wondering if anyone can see why the automatically generated models fail to load.
Any ideas for me?