training_extensions

NonMaxSuppression parameters are not deployed correctly

Open cansik opened this issue 1 year ago • 4 comments

I trained an object detection model based on YOLOX with otx and exported / optimized it as openvino model. The network includes a NonMaxSuppression operator to post-process the detected objects which is great.

The only problem is that now the score-threshold is fixed to 0.01 and nms is fixed to 0.65 as set in the test_cfg. It seems that the post_processing settings in the deployment.py are ignored and only the test_cfg settings from the model.py are applied. Is this behaviour intended?

But since these values maybe should be adaptive, wouldn't it make sense to expose them as network inputs? Or is there another way to change the thresholds (maybe even on runtime)? I know this is maybe more MMDetection related, but the deployment problem seems to be otx related.
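As an illustration of what I mean by adaptive thresholds, here is a post-hoc workaround that filters the deployed model's outputs by an arbitrary score threshold at runtime. This is only a sketch: it assumes a `[x1, y1, x2, y2, score]` row layout, which may differ from the model's actual output format.

```python
import numpy as np

def filter_detections(dets: np.ndarray, score_thr: float) -> np.ndarray:
    """Keep detections whose score exceeds score_thr.

    Assumes rows of [x1, y1, x2, y2, score]; the score column index
    is an assumption and depends on the exported model's output layout.
    """
    return dets[dets[:, 4] >= score_thr]

# Example: three detections with scores 0.90, 0.30, 0.02.
dets = np.array([
    [10.0, 10.0, 50.0, 50.0, 0.90],
    [12.0, 12.0, 52.0, 52.0, 0.30],
    [100.0, 100.0, 150.0, 150.0, 0.02],
])
print(len(filter_detections(dets, 0.5)))   # 1 detection survives
print(len(filter_detections(dets, 0.01)))  # all 3 survive
```

This only helps with the score threshold, though; the NMS IoU threshold is baked into the graph and cannot be recovered after suppression has run.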

cansik avatar Jul 03 '23 08:07 cansik

Thanks for reporting; that makes sense to me. Let me check.

jaegukhyun avatar Jul 04 '23 02:07 jaegukhyun

Hi @cansik, I've looked into this issue. I think we need more detailed information about your request.

  1. Do you want to change score_thr and nms_threshold during training? If so, changing model.py in your workspace seems appropriate.
  2. Do you want to change score_thr and nms_threshold during PyTorch inference? In this case too, changing model.py in your workspace seems appropriate.
  3. Do you want to change score_thr and nms_threshold during OpenVINO inference?
     3-1. In this case, you can change model.py in your workspace before exporting the model. Do you think this is not a proper way?
     3-2. Maybe you want to change score_thr and nms_threshold for an already exported model. If this is your case, more investigation is needed.

jaegukhyun avatar Jul 13 '23 06:07 jaegukhyun

@jaegukhyun In deployment.py I can change various values for the ONNX and OpenVINO export, like input size and other parameters. These parameters are used when I run the otx export command, so it is all about exporting models from PyTorch to ONNX / OpenVINO.

However, the export does not take the two parameters confidence_threshold and iou_threshold into account. This leads to an exported model with default NMS values instead of the configured ones.
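For reference, the post_processing block I mean in deployment.py looks roughly like this mmdeploy-style fragment (a sketch; field names follow mmdeploy's detection deploy configs and may vary between versions):

```python
# mmdeploy-style codebase_config as typically found in a detection
# deployment.py. The values shown are illustrative, not my actual config.
codebase_config = dict(
    type='mmdet',
    task='ObjectDetection',
    post_processing=dict(
        score_threshold=0.05,
        confidence_threshold=0.005,  # reported here as ignored at export time
        iou_threshold=0.5,           # likewise not reflected in the IR model
        max_output_boxes_per_class=200,
        pre_top_k=5000,
        keep_top_k=100,
        background_label_id=-1,
    ),
)
```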

cansik avatar Jul 17 '23 07:07 cansik

Hi @cansik. There was a misunderstanding in my previous guidance. OTX collects configurable hyperparameters in https://github.com/openvinotoolkit/training_extensions/blob/develop/src/otx/algorithms/detection/configs/detection/configuration.yaml. This configuration.yaml file is placed in your workspace when you use the otx build or otx train command. You may be able to change some hyperparameters from model.py or deployment.py rather than configuration.yaml, but this is not recommended, since we plan to hide those files. In the near future, users will only be able to access hyperparameters through configuration.yaml.

In summary, we want users to change hyperparameters via configuration.yaml and template.yaml, and we don't recommend changing those parameters via model.py and the other files.

However, the current configuration.yaml can only change the confidence threshold; iou_threshold and input_size are still fixed by model.py and data_pipeline.py. https://github.com/openvinotoolkit/training_extensions/pull/2388 is a PR that enables changing the confidence threshold during model export and OpenVINO model inference. Other parameters such as iou_threshold and input_size will have to wait until patching through configuration.yaml is supported.

In summary, the confidence threshold can now be changed by the user at model export and OpenVINO model inference time; you can check the usage of this variable in the PR summary. Input size and IoU threshold will have to wait for support in configuration.yaml.
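A rough CLI sketch of how such a configuration.yaml-backed parameter would be passed at export time (a hypothetical example: the exact parameter path and flags are assumptions, so please check `otx export --help` and the PR summary for the actual usage):

```shell
# Hypothetical: export with an overridden confidence threshold, assuming
# configuration.yaml exposes it under a `postprocessing` group reachable
# through the `params` subcommand.
otx export <template> --load-weights weights.pth --save-model-to ./exported \
    params --postprocessing.confidence_threshold 0.35
```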

jaegukhyun avatar Jul 27 '23 01:07 jaegukhyun