
[Bug]: Results of (padding=True, pad_maps=False) are worse than (padding=False, pad_maps=True) of EfficientAD

Open jiamfeng opened this issue 10 months ago • 6 comments

Describe the bug

When I trained EfficientAD on my own dataset, I found that the results of (padding=True, pad_maps=False) are worse than those of the default setting (padding=False, pad_maps=True). I am confused by this. The results are shown in the attached image.

Dataset

Folder

Model

Other (please specify in the field below)

Steps to reproduce the behavior

```python
# Create the datamodule
datamodule = Folder(
    name="white_point",
    root="/home/data/QiangBan/white_point/train",
    normal_dir="good",
    test_split_mode=TestSplitMode.SYNTHETIC,
)
# datamodule = MVTec()

# Setup the datamodule
datamodule.setup()

model = EfficientAd(
    model_size=EfficientAdModelSize.S,
    # padding=True,
    # pad_maps=False,
)

# Prepare the tiling configuration callback
# tiler_config_callback = TilerConfigurationCallback(enable=True, tile_size=[512, 512], stride=64)

# Pass the tiling configuration callback to the engine
engine = Engine(image_metrics=["AUROC"], pixel_metrics=["AUROC"], max_epochs=100)

# Train the model (tiling is seamlessly utilized in the background)
engine.fit(datamodule=datamodule, model=model)
```

OS information

OS information:

  • OS: [e.g. Ubuntu 20.04]
  • Python version: [3.10.0]
  • Anomalib version: [1.0.1 latest]
  • PyTorch version: [2.0.0]
  • CUDA/cuDNN version: [e.g. 11.8]
  • GPU models and configuration: [1x GeForce RTX 3090]
  • Any other relevant information: [I'm using a custom dataset]

Expected behavior

Get results matching the default configuration of the model, which pads the anomaly map by 4 pixels in the code.

Screenshots

No response

Pip/GitHub

GitHub

What version/branch did you use?

latest

Configuration YAML

No YAML was used; the model is trained from a Python script (see the code above).

Logs

No

Code of Conduct

  • [X] I agree to follow this project's Code of Conduct

jiamfeng avatar Apr 18 '24 12:04 jiamfeng

What about setting both padding and pad_maps to True?

papago2355 avatar Apr 22 '24 05:04 papago2355

From a qualitative view, (padding=False, pad_maps=True) gives you worse results because it misses all anomalies in the border region of your image. However, I think anomalies are detected well in your custom data; the overly large anomaly area might also be due to disadvantageous segmentation-mask labeling.
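As a minimal sketch of why a padded map cannot flag border anomalies (the values here are illustrative random numbers, and this only mimics, not reproduces, anomalib's pad_maps step):

```python
import numpy as np

# Hypothetical 56x56 anomaly map from the unpadded model (padding=False);
# random values stand in for real model output.
raw_map = np.random.rand(56, 56)

# pad_maps-style step (sketch): pad 4 pixels per side with a constant,
# restoring a 64x64 map that matches the padded model's geometry.
padded = np.pad(raw_map, pad_width=4, mode="constant", constant_values=0.0)

print(padded.shape)      # (64, 64)
print(padded[:4].max())  # 0.0 -- border rows are constant, so anomalies there always score zero
```

Because the 4-pixel frame is filled with a constant, no thresholding scheme can ever mark a defect that lies entirely within it.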

alexriedel1 avatar Apr 22 '24 14:04 alexriedel1

What about setting both padding and pad_maps to True?

I didn't try this configuration. Judging by the results with padding=True, I don't think it would detect well.

jiamfeng avatar Apr 24 '24 11:04 jiamfeng

From a qualitative view, (padding=False, pad_maps=True) gives you worse results because it misses all anomalies in the border region of your image. However, I think anomalies are detected well in your custom data; the overly large anomaly area might also be due to disadvantageous segmentation-mask labeling.

The result of (padding=False, pad_maps=True) is better. Analyzing the pretrained model, padding=True increases an intermediate size in the model from 256×256 to 259×259; the increase comes from the convolution layers. I think this is the reason.
The output of (padding=False, pad_maps=True) is 56×56, and pad_maps=True adds 4 pixels of padding at each border; otherwise the output size is 64×64.
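The size arithmetic above can be sketched with the standard convolution output-size formula (the concrete numbers 56, 64, and the 4-pixel pad come from this discussion, not from inspecting the model):

```python
def conv_out(n: int, k: int, s: int = 1, p: int = 0) -> int:
    """Standard convolution output-size formula: floor((n + 2p - k) / s) + 1."""
    return (n + 2 * p - k) // s + 1

# A single valid (unpadded) 4x4 conv shrinks a map by 3 pixels per axis:
print(conv_out(59, 4))  # 56

# pad_maps=True then restores the padded model's 64x64 geometry by
# adding 4 pixels on each border: 56 + 2 * 4 = 64
print(56 + 2 * 4)       # 64
```

So the two settings differ only in where the missing border pixels are reintroduced: inside the convolutions (padding=True) or as a constant frame on the final map (pad_maps=True).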

jiamfeng avatar Apr 24 '24 11:04 jiamfeng

I'm not so sure about:

The result of (padding=False, pad_maps=True) is better.

Because of pad_maps, there are border regions of the image where anomalies are not detected. In your case this even affects a separate anomaly at the top-right border, for example (see attached image).

alexriedel1 avatar Apr 25 '24 08:04 alexriedel1

I'm not so sure about:

The result of (padding=False, pad_maps=True) is better.

Because of pad_maps, there are border regions of the image where anomalies are not detected. In your case this even affects a separate anomaly at the top-right border, for example (see attached image).

Yes. Even though the result of (padding=False, pad_maps=True) can detect the border of the image, it leads to more background false positives. I think the distillation of the pretrained model may be important (I am not sure whether the pretrained model was distilled without padding or not); some experiments may be needed to verify this.

jiamfeng avatar Apr 26 '24 02:04 jiamfeng