TideFinder
Hi, thanks for the help. When I use 700x1200 it takes 15 s to get a prediction. I only measure the time during inference, so model instantiation shouldn't be the reason for such...
Okay, as you mentioned, it may be due to using OpenVINO with the CPU option. So I tested the torch_inferencer, and it is really faster now: with 700x1200, I got 0.6 s for a prediction...
@samet-akcay Hi. I recently switched the inference module to TorchInferencer and it seems like the issue has been addressed. However, I still don't know why it gets extremely slow when I...
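For reference, a minimal sketch of how the switch and the timing might look, assuming anomalib's `TorchInferencer` from `anomalib.deploy`; the checkpoint and image paths below are placeholders, and the exact constructor arguments can differ between anomalib versions:

```python
import time

import cv2
from anomalib.deploy import TorchInferencer

# Hypothetical paths -- replace with your exported model and test image.
inferencer = TorchInferencer(path="results/weights/torch/model.pt", device="cpu")
image = cv2.cvtColor(cv2.imread("sample_700x1200.png"), cv2.COLOR_BGR2RGB)

# Time only the predict() call, as described above, so that model
# instantiation is excluded from the measurement.
start = time.perf_counter()
predictions = inferencer.predict(image)
print(f"prediction took {time.perf_counter() - start:.2f}s")
```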
What about setting both padding and pad_map to 'True'?