Time-Flies
Time Flies: Animating a Still Image with Time-Lapse Video as Reference (CVPR 2020)
PyTorch implementation of the following paper. In this paper, we propose a self-supervised, end-to-end model that generates a time-lapse video from a single still image and a reference video.

Paper
Time Flies: Animating a Still Image With Time-Lapse Video As Reference
Chia-Chi Cheng, Hung-Yu Chen, Wei-Chen Chiu
IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2020.
Please cite our paper if you find it useful for your research.
@InProceedings{Cheng_2020_CVPR,
author = {Cheng, Chia-Chi and Chen, Hung-Yu and Chiu, Wei-Chen},
title = {Time Flies: Animating a Still Image With Time-Lapse Video As Reference},
booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2020}
}
Example Results
Installation
- This code was developed with Python 3.6.9, PyTorch 1.0.0, and CUDA 10.1.
- Other requirements: OpenCV (cv2), NumPy, natsort
- Clone this repo
git clone https://github.com/angelwmab/Time-Flies.git
cd Time-Flies
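The README does not ship a requirements file, so the exact PyPI package names below are an assumption (notably `opencv-python`, which provides the `cv2` module); pick the torch build matching your CUDA version:

```shell
# Assumed package names; the README pins PyTorch 1.0.0 and CUDA 10.1.
pip install torch==1.0.0 numpy natsort opencv-python
```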
Testing
Download our pretrained models from here and put them under models/.
Run the sample data provided in this repo:
python test.py
Run your own data:
python test.py --vid_dir YOUR_REF_VID_FRAME_PATH \
               --seg_dir YOUR_SEGMENTATION_PATH \
               --target_img_path YOUR_TARGET_IMG_PATH
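Note that `--vid_dir` points at the reference video as a directory of extracted frames. The natsort requirement suggests frames are ordered naturally rather than lexicographically (so `frame_2` precedes `frame_10`); a stdlib sketch of that ordering (the `natural_key` helper is my illustration, not the repo's code):

```python
import re

def natural_key(name):
    # Split the filename into digit and non-digit runs, comparing digit
    # runs as integers, so "frame_2.png" sorts before "frame_10.png".
    return [int(t) if t.isdigit() else t for t in re.split(r"(\d+)", name)]

frames = ["frame_10.png", "frame_2.png", "frame_1.png"]
print(sorted(frames, key=natural_key))
# → ['frame_1.png', 'frame_2.png', 'frame_10.png']
```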
Training
Download the webcamclipart dataset here and put it under webcamclipart/.
Download the segmentation maps of each scene here and put them under segmentations/.
Then you can directly run the training code:
python train.py
Train the model with your own dataset:
python train.py --vid_dir YOUR_REF_VID_DATASET \
                --seg_dir YOUR_SEGMENTATION_DIR