generatedTIR_tracking
Synthetic data generation for end-to-end TIR tracking (TIP 2018) [paper]
Citation
Please cite our paper if you find this idea useful in your research.
@article{zhang2018synthetic,
  title={Synthetic data generation for end-to-end thermal infrared tracking},
  author={Zhang, Lichao and Gonzalez-Garcia, Abel and van de Weijer, Joost and Danelljan, Martin and Khan, Fahad Shahbaz},
  journal={IEEE Transactions on Image Processing},
  volume={28},
  number={4},
  pages={1837--1850},
  year={2018},
  publisher={IEEE}
}
Instructions
This project transfers RGB tracking videos to TIR tracking videos in order to complement the scarce TIR data available for training. We provide two kinds of models, corresponding to the two stages of our project: the transferring stage and the fine-tuning stage.
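As a concrete illustration of the transferring stage, the sketch below applies a trained RGB-to-TIR generator to an RGB video frame by frame. This is a minimal sketch, not the project's actual pipeline: the TorchScript file rgb2tir_generator.pt, the input video path, the 256x256 resolution, and the 3-channel output are all illustrative assumptions; only the [-1, 1] input range follows the usual pix2pix/CycleGAN convention.

import cv2
import numpy as np
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
# Hypothetical TorchScript export of a trained pix2pix/CycleGAN generator.
netG = torch.jit.load("rgb2tir_generator.pt", map_location=device).eval()

cap = cv2.VideoCapture("rgb_sequence.avi")  # assumed input RGB tracking video
frame_id = 0
with torch.no_grad():
    while True:
        ok, bgr = cap.read()
        if not ok:
            break
        rgb = cv2.cvtColor(cv2.resize(bgr, (256, 256)), cv2.COLOR_BGR2RGB)
        # Scale pixels to [-1, 1], the usual input range for these generators.
        x = torch.from_numpy(rgb).permute(2, 0, 1).float().div(127.5).sub(1.0)
        y = netG(x.unsqueeze(0).to(device)).squeeze(0).cpu()
        # Map the output back to [0, 255]; a 3-channel output is assumed here.
        tir = ((y.clamp(-1, 1) + 1.0) * 127.5).permute(1, 2, 0).numpy().astype(np.uint8)
        cv2.imwrite("tir_%05d.png" % frame_id, cv2.cvtColor(tir, cv2.COLOR_RGB2BGR))
        frame_id += 1
cap.release()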
Analysis for RGB and TIR
Results for the two image translation methods considered, pix2pix and CycleGAN, on the test set of KAIST [1].
Left: average activation of the filters in the first layer of a pre-trained AlexNet. Right: histogram of the gradient magnitude for real and synthetic TIR data. A minimal sketch of both analyses follows below.
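The sketch below reproduces both analyses under stated assumptions: the image file names are placeholders, TIR frames are replicated to three channels to match AlexNet's RGB input, and the gradient magnitude is computed with simple finite differences.

import numpy as np
import torch
from PIL import Image
from torchvision import models, transforms

# Pre-trained AlexNet; only its first convolutional layer is needed here.
alexnet = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1).eval()
first_conv = alexnet.features[0]

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def first_layer_mean_activation(path):
    # Average activation of each first-layer filter on one image.
    img = Image.open(path).convert("RGB")  # TIR frames replicated to 3 channels
    with torch.no_grad():
        act = first_conv(preprocess(img).unsqueeze(0))
    return act.mean(dim=(0, 2, 3)).numpy()  # one value per filter

def gradient_magnitude_histogram(path, bins=50):
    # Histogram of per-pixel gradient magnitudes (finite differences).
    gray = np.asarray(Image.open(path).convert("L"), dtype=np.float32)
    gy, gx = np.gradient(gray)
    return np.histogram(np.sqrt(gx ** 2 + gy ** 2), bins=bins)

# "real_tir.png" and "synthetic_tir.png" are placeholder file names.
real_act = first_layer_mean_activation("real_tir.png")
fake_act = first_layer_mean_activation("synthetic_tir.png")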
Models
- Download generated models: the unified project for both pix2pix and CycleGAN is available at the link.
- Video examples for the transferred models (from left to right: RGB, ground truth, pix2pix, CycleGAN).
- Download fine-tuned models (after downloading, put them in the folder ECO_tir/feature_extraction/networks; see the placement sketch below).
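The placement step can be scripted. The helper below is only a hedged sketch: the downloads/ source folder is an assumption, and the *.mat pattern reflects the MatConvNet format that ECO's feature networks typically use.

import shutil
from pathlib import Path

# Folder where ECO_tir looks for its feature networks.
dst = Path("ECO_tir/feature_extraction/networks")
dst.mkdir(parents=True, exist_ok=True)

# "downloads" is a placeholder for wherever the model files were saved.
for f in Path("downloads").glob("*.mat"):
    shutil.move(str(f), str(dst / f.name))
    print("placed", f.name, "in", str(dst))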
Results
- Download the results for comparison.
References
[1] S. Hwang, J. Park, N. Kim, Y. Choi, and I. S. Kweon. Multispectral pedestrian detection: Benchmark dataset and baseline. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2015.
Contact
For further inquiries, please contact me at [email protected], or submit a bug report on the project's GitHub page.