
Padim inference on custom dataset throws Value Error

shrinand1996 opened this issue on Jun 14 '22

Describe the bug
I trained PaDiM on a custom dataset for a classification task. The model trained well, but I cannot run inference with tools/inference.py. When I run inference.py, it throws the following error:

ValueError: could not broadcast input array from shape (14,195,3) into shape (14,108,3)
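For context, the error itself is a plain NumPy broadcasting failure: a rendered label patch that is wider than the output image gets assigned into a slice that is clipped to the image width. A minimal sketch that reproduces the same message with the shapes reported above (the array sizes and names are illustrative, not taken from anomalib's code):

```python
import numpy as np

# Output canvas the label is drawn onto; only 108 px wide here.
prediction = np.zeros((224, 108, 3), dtype=np.uint8)

# Rendered text patch for the classification label; 195 px wide, i.e. wider than the canvas.
label_patch = np.zeros((14, 195, 3), dtype=np.uint8)

height, width = label_patch.shape[:2]
baseline = 0

# The target slice is clipped to the canvas width (108), but the patch is 195 px wide:
# ValueError: could not broadcast input array from shape (14,195,3) into shape (14,108,3)
prediction[: baseline + height, : baseline + width] = label_patch
```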

To Reproduce
Steps to reproduce the behavior:

  1. Train a PaDiM classification model on a custom dataset.
  2. Run tools/inference.py on the trained model.
  3. See the error.

Expected behavior
Inference should work for the classification task the same way it works for segmentation.


Hardware and Software Configuration
Google Colab

Additional context

/usr/local/lib/python3.7/dist-packages/torchmetrics/utilities/prints.py:36: UserWarning: Torchmetrics v0.9 introduced a new argument class property called full_state_update that has not been set for this class (AdaptiveThreshold). The property determines if update by default needs access to the full metric state. If this is not the case, significant speedups can be achieved and we recommend setting this to False. We provide an checking function from torchmetrics.utilities import check_forward_no_full_state that can be used to check if the full_state_update=True (old and potential slower behaviour, default for now) or if full_state_update=False can be used safely.
  warnings.warn(*args, **kwargs)
/usr/local/lib/python3.7/dist-packages/torchmetrics/utilities/prints.py:36: UserWarning: Metric PrecisionRecallCurve will save all targets and predictions in buffer. For large datasets this may lead to large memory footprint.
  warnings.warn(*args, **kwargs)
/usr/local/lib/python3.7/dist-packages/torchmetrics/utilities/prints.py:36: UserWarning: Torchmetrics v0.9 introduced a new argument class property called full_state_update that has not been set for this class (AnomalyScoreDistribution). The property determines if update by default needs access to the full metric state. If this is not the case, significant speedups can be achieved and we recommend setting this to False. We provide an checking function from torchmetrics.utilities import check_forward_no_full_state that can be used to check if the full_state_update=True (old and potential slower behaviour, default for now) or if full_state_update=False can be used safely.
  warnings.warn(*args, **kwargs)
/usr/local/lib/python3.7/dist-packages/torchmetrics/utilities/prints.py:36: UserWarning: Torchmetrics v0.9 introduced a new argument class property called full_state_update that has not been set for this class (MinMax). The property determines if update by default needs access to the full metric state. If this is not the case, significant speedups can be achieved and we recommend setting this to False. We provide an checking function from torchmetrics.utilities import check_forward_no_full_state that can be used to check if the full_state_update=True (old and potential slower behaviour, default for now) or if full_state_update=False can be used safely.
  warnings.warn(*args, **kwargs)

Traceback (most recent call last):
  File "tools/inference.py", line 170, in <module>
    stream()
  File "tools/inference.py", line 126, in stream
    infer(args.image_path, inferencer, args.save_path, args.overlay_mask)
  File "tools/inference.py", line 152, in infer
    output = add_label(anomaly_map, score)
  File "tools/inference.py", line 84, in add_label
    prediction[: baseline + height, : baseline + width] = label_patch
ValueError: could not broadcast input array from shape (14,195,3) into shape (14,108,3)
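The assignment in add_label only fails when the rendered label is larger than the image it is pasted onto (e.g. a narrow input image combined with a long label string). One way to make the paste robust is to crop the patch to the canvas first; this is only an illustrative workaround sketch, not anomalib's actual fix:

```python
import numpy as np

def paste_label_patch(prediction: np.ndarray, label_patch: np.ndarray) -> np.ndarray:
    """Paste label_patch into the top-left corner of prediction, cropping it if it is too large.

    Illustrative workaround for the broadcasting error above, not the upstream patch.
    """
    # Clamp the paste region to the prediction's own height and width.
    max_h = min(label_patch.shape[0], prediction.shape[0])
    max_w = min(label_patch.shape[1], prediction.shape[1])
    prediction[:max_h, :max_w] = label_patch[:max_h, :max_w]
    return prediction

# With the shapes from the traceback, only the visible 108 px of the label are pasted.
canvas = paste_label_patch(np.zeros((224, 108, 3), np.uint8), np.zeros((14, 195, 3), np.uint8))
```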

shrinand1996 · Jun 14 '22 09:06

Hi @shrinand1996, can you share your config file so we can have a look at the details?

samet-akcay · Jun 15 '22 14:06

@shrinand1996, can you check this with the new inferencer we recently introduced?

samet-akcay · Jul 11 '22 16:07
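For anyone landing here with the same error: programmatic inference through anomalib's Torch-based inferencer looked roughly like the sketch below around that time. The exact keyword names and return type changed between releases, so treat the paths and argument names as assumptions and check them against the version you have installed:

```python
from anomalib.deploy import TorchInferencer

# Placeholder paths for your own trained PaDiM config and checkpoint (assumptions).
inferencer = TorchInferencer(
    config="path/to/padim/config.yaml",   # keyword name may differ per release
    model_source="path/to/model.ckpt",    # keyword name may differ per release
)

# predict() runs the model on a single image and returns the anomaly map / score.
result = inferencer.predict(image="path/to/test_image.png")
print(result)
```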

Closing due to inactivity. Feel free to reopen if the problem persists.

djdameln · Aug 29 '22 14:08