[Bug]: Results of (padding=True, pad_maps=False) are worse than (padding=False, pad_maps=True) of EfficientAD
Describe the bug
When I trained EfficientAD on my own dataset, I found that the results of (padding=True, pad_maps=False) are worse than those of the default setting (padding=False, pad_maps=True), and I am confused about this. The results are below.
Dataset
Folder
Model
Other (please specify in the field below)
Steps to reproduce the behavior
# Imports per anomalib 1.0 (the EfficientAdModelSize path may vary by version)
from anomalib.data import Folder, MVTec
from anomalib.data.utils import TestSplitMode
from anomalib.engine import Engine
from anomalib.models import EfficientAd
from anomalib.models.image.efficient_ad.torch_model import EfficientAdModelSize

# Create and set up the datamodule
datamodule = Folder(
    name="white_point",
    root="/home/data/QiangBan/white_point/train",
    normal_dir="good",
    test_split_mode=TestSplitMode.SYNTHETIC,
)
# datamodule = MVTec()
datamodule.setup()

model = EfficientAd(
    model_size=EfficientAdModelSize.S,
    # padding=True,
    # pad_maps=False,
)

# Optional tiling configuration callback (pass it to the engine to enable tiling)
# tiler_config_callback = TilerConfigurationCallback(enable=True, tile_size=[512, 512], stride=64)

engine = Engine(image_metrics=["AUROC"], pixel_metrics=["AUROC"], max_epochs=100)

# Train the model (tiling, if enabled, is utilized seamlessly in the background)
engine.fit(datamodule=datamodule, model=model)
OS information
- OS: [e.g. Ubuntu 20.04]
- Python version: [3.10.0]
- Anomalib version: [1.0.1 latest]
- PyTorch version: [2.0.0]
- CUDA/cuDNN version: [e.g. 11.8]
- GPU models and configuration: [1x GeForce RTX 3090]
- Any other relevant information: [I'm using a custom dataset]
Expected behavior
I expected results comparable to the default configuration of the model, in which the anomaly maps are padded by 4 pixels in code.
Screenshots
No response
Pip/GitHub
GitHub
What version/branch did you use?
latest
Configuration YAML
I did not use a YAML configuration; I trained from a Python script. The code is above.
Logs
No response
Code of Conduct
- [X] I agree to follow this project's Code of Conduct
What about setting both padding and pad_maps to True?
From a qualitative view, (padding=False, pad_maps=True) gives you worse results because it misses all anomalies in the border region of your image. However, I think anomalies are well detected in your custom data; the reason for the overly large anomaly area might also lie in disadvantageous segmentation-mask labeling.
What about setting both padding and pad_maps to True?
I haven't tried this configuration. Based on the results with padding=True, I don't think it would detect anomalies well.
From a qualitative view, (padding=False, pad_maps=True) gives you worse results because it misses all anomalies in the border region of your image. However, I think anomalies are well detected in your custom data; the reason for the overly large anomaly area might also lie in disadvantageous segmentation-mask labeling.
The result of (padding=False, pad_maps=True) is better. Analysing the pretrained model, padding=True grows the spatial size from 256×256 to 259×259 at the first convolution layer; I think this size increase is the reason.
The output of (padding=False, pad_maps=True) is 56×56, and pad_maps=True then adds 4 pixels of padding on each border; otherwise (with padding=True) the output size is 64×64.
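For reference, the 56-vs-64 output sizes and the 256→259 growth can be checked with plain convolution arithmetic. This is only a sketch: the kernel/padding values below are my reading of the PDN-S teacher network and should be treated as an assumption, not verified anomalib source.

```python
# Sketch: spatial-size arithmetic for a PDN-S-like conv stack (assumed layout).

def out_size(n: int, k: int, s: int = 1, p: int = 0) -> int:
    """Output size of a conv/pool layer: floor((n + 2p - k) / s) + 1."""
    return (n + 2 * p - k) // s + 1

def pdn_s_output(n: int, padding: bool) -> int:
    m = 1 if padding else 0          # pad multiplier toggled by `padding`
    n = out_size(n, k=4, p=3 * m)    # conv1: 256 -> 259 (padded) or 253
    n = out_size(n, k=2, s=2, p=m)   # avgpool1
    n = out_size(n, k=4, p=3 * m)    # conv2
    n = out_size(n, k=2, s=2, p=m)   # avgpool2
    n = out_size(n, k=3, p=m)        # conv3
    n = out_size(n, k=4)             # conv4 (never padded)
    return n

print(pdn_s_output(256, padding=False))  # 56
print(pdn_s_output(256, padding=True))   # 64
```

Under these assumed layer hyperparameters, the first convolution with padding indeed maps 256 to 259, and the final map sizes come out as 56×56 (unpadded) and 64×64 (padded).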
I'm not so sure about:
The result of (padding=False, pad_maps=True) is better.
Because of pad_maps, there are border regions of the image where anomalies are not detected. In your case this even affects a separate anomaly in the top-right border, for example.
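The border blind spot described here can be sketched numerically. Assuming the 56×56 map is padded with 4 constant (zero) pixels per side to 64×64 before upsampling to the 256×256 image, the zero band corresponds to a fixed-width image border that can never receive an anomaly score:

```python
# Sketch (assumption): with padding=False, the 56x56 anomaly map is padded by
# 4 zeros per side to 64x64 before upsampling, leaving a scoreless border band.

MAP, PADDED, IMG, PAD = 56, 64, 256, 4

# Each padded-map pixel covers IMG / PADDED image pixels after upsampling,
# so the zero border corresponds to this many image pixels per side:
border_px = PAD * IMG // PADDED
print(border_px)  # 16

# Toy anomaly map: every interior location fires; the border is the zero pad.
padded_map = [
    [1.0 if PAD <= r < PADDED - PAD and PAD <= c < PADDED - PAD else 0.0
     for c in range(PADDED)]
    for r in range(PADDED)
]
# An anomaly in the top-right corner of the image falls in the zero band:
print(padded_map[0][PADDED - 1])  # 0.0 -> undetectable
```

So under this assumption, roughly a 16-pixel band around a 256×256 image is blind to anomalies when pad_maps supplies the border.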
I'm not so sure about:
The result of (padding=False, pad_maps=True) is better.
Because of pad_maps, there are border regions of the image where anomalies are not detected. In your case this even affects a separate anomaly in the top-right border, for example.
Yes. Even though padding=True can detect the border of the image, it leads to more background false positives. I think the distillation of the pretrained model may be important (I am not sure whether the pretrained model was distilled with or without padding); some experiments may be needed to prove this.