
Deep Hash Distillation for Image Retrieval

Official PyTorch implementation of "Deep Hash Distillation for Image Retrieval" (DHD), accepted to ECCV 2022.

(Figure: overall training procedure of DHD)

Requirements

Install the requirements with the following command:

pip install -r requirements.txt

Train DHD models

Prepare datasets

We use the public benchmark datasets ImageNet, NUS-WIDE, and MS COCO. Image file names and their corresponding labels are provided in ./data.

Datasets can be downloaded here: NUS-WIDE / MS COCO

For ImageNet, please download it from the official ImageNet website and follow our data configuration.
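
As a reference for the data configuration, here is a minimal sketch of how such list files are often parsed. The line format (an image file name followed by a multi-hot label vector) and the file name `train.txt` are assumptions for illustration; check the actual files in ./data.

```python
# Hypothetical parser for the list files in ./data.
# Assumed line format (verify against the actual files):
#   <image_file> <l_0> <l_1> ... <l_{C-1}>
# where l_i are 0/1 entries of a multi-hot label vector.
from pathlib import Path

def read_list_file(path):
    items = []
    for line in Path(path).read_text().splitlines():
        if not line.strip():
            continue  # skip blank lines
        parts = line.split()
        fname, labels = parts[0], [int(x) for x in parts[1:]]
        items.append((fname, labels))
    return items

# Usage (hypothetical file name):
# samples = read_list_file("./data/imagenet/train.txt")
```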

Example

  • Train a DHD model on ImageNet with an AlexNet backbone, 64-bit hash codes, and a temperature of 0.2:
  • python main_DHD.py --dataset=imagenet --encoder=AlexNet --N_bits=64 --temp=0.2

Running python main_DHD.py --help prints a detailed explanation of each argument.
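
To illustrate what --temp controls, here is a minimal sketch of temperature-scaled cosine similarity as used in proxy-style hashing objectives. This is a generic illustration, not the loss code from this repository; `codes`, `proxies`, and `scaled_similarity` are hypothetical names.

```python
import torch
import torch.nn.functional as F

def scaled_similarity(codes, proxies, temp=0.2):
    """Cosine similarity between continuous hash codes (batch, n_bits)
    and class proxies (n_classes, n_bits), divided by the temperature.
    A lower temperature sharpens the softmax over classes."""
    codes = F.normalize(codes, dim=1)
    proxies = F.normalize(proxies, dim=1)
    return codes @ proxies.t() / temp  # (batch, n_classes) logits

# Classification-style hash learning then applies cross-entropy to the
# scaled logits (single-label case shown for simplicity):
# loss = F.cross_entropy(scaled_similarity(codes, proxies, temp=0.2), targets)
```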

Retrieval Results with Different Backbones

S: Swin Transformer, R: ResNet, A: AlexNet

(Result tables: ImageNet, NUS-WIDE, MS COCO)
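
For context on how such retrieval results are typically computed, below is a minimal sketch of mAP evaluation by Hamming ranking over binarized codes. It is a generic illustration under stated assumptions ({-1, +1} codes, float multi-hot labels), not this repository's evaluation script.

```python
import torch

def mean_average_precision(query_codes, gallery_codes,
                           query_labels, gallery_labels, topk=1000):
    """mAP@topk via Hamming ranking. Codes are {-1, +1} float tensors
    of shape (n, n_bits); labels are float multi-hot (n, n_classes).
    Illustrative only."""
    n_bits = query_codes.size(1)
    aps = []
    for q_code, q_label in zip(query_codes, query_labels):
        # On {-1, +1} codes, Hamming distance = (n_bits - <a, b>) / 2.
        dist = 0.5 * (n_bits - gallery_codes @ q_code)
        order = dist.argsort()[:topk]
        # A gallery item is relevant if it shares any label with the query.
        relevant = (gallery_labels[order] @ q_label > 0).float()
        if relevant.sum() == 0:
            continue  # no relevant item retrieved for this query
        ranks = torch.arange(1, relevant.numel() + 1, dtype=torch.float)
        precision_at_hit = relevant.cumsum(0) / ranks
        aps.append((precision_at_hit * relevant).sum() / relevant.sum())
    return torch.stack(aps).mean()
```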

Citation

@inproceedings{DHD,
  title={Deep Hash Distillation for Image Retrieval},
  author={Young Kyun Jang and Geonmo Gu and Byungsoo Ko and Isaac Kang and Nam Ik Cho},
  booktitle={Proceedings of the European Conference on Computer Vision (ECCV)},
  year={2022}
}