StableKeypoints
Unsupervised Keypoints from Pretrained Diffusion Models
Eric Hedlin, Gopal Sharma, Shweta Mahajan, Xingzhe He, Hossam Isack, Abhishek Kar, Helge Rhodin, Andrea Tagliasacchi, Kwang Moo Yi
Project Page
For more detailed information, visit our project page or read our paper.
Requirements
Set up environment
Create a conda environment using the provided requirements.yaml:
conda env create -f requirements.yaml
conda activate StableKeypoints
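A quick way to confirm the environment is usable is to check that PyTorch is importable and whether a GPU is visible. This assumes the environment created from requirements.yaml includes torch; CUDA availability will depend on your machine.
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"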
Download datasets
The CelebA, Taichi, Human3.6M, DeepFashion, and CUB datasets can be found on their respective websites.
Preprocessed data for CelebA and CUB can be found in Autolink's repository.
Usage
To use the code, run:
python3 -m unsupervised_keypoints.main [arguments]
Main Arguments
--dataset_loc: Path to the dataset.
--dataset_name: Name of the dataset.
--num_steps: Number of optimization steps (default 500; up to 10,000 for non-human datasets).
--evaluation_method: Following prior baselines, the evaluation method varies by dataset (a sketch of the CelebA metric follows this list):
- CelebA: 'inter_eye_distance'
- CUB: 'visible'
- Taichi: 'mean_average_error' (renormalized per keypoint)
- DeepFashion: 'pck'
- Human3.6M: 'orientation_invariant'
--save_folder: Output save location (default "outputs" inside the repository).
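As an illustration of the CelebA setting, below is a minimal sketch of a keypoint error normalized by the inter-eye distance. The array shapes and the eye-landmark indices (left_eye_idx, right_eye_idx) are assumptions for illustration, not the repository's actual evaluation code.

import numpy as np

def inter_eye_normalized_error(pred, gt, left_eye_idx=0, right_eye_idx=1):
    """Mean keypoint error normalized by the ground-truth inter-eye distance.

    pred, gt: arrays of shape (num_keypoints, 2) in pixel coordinates.
    The eye indices are hypothetical placeholders for this sketch.
    """
    # Normalizer: distance between the two ground-truth eye landmarks.
    inter_eye = np.linalg.norm(gt[right_eye_idx] - gt[left_eye_idx])
    # Per-keypoint Euclidean error between predictions and ground truth.
    per_point_error = np.linalg.norm(pred - gt, axis=-1)
    return per_point_error.mean() / inter_eye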
Example Usage
python3 -m unsupervised_keypoints.main --dataset_loc /path/to/dataset --dataset_name celeba_wild --evaluation_method inter_eye_distance --save_folder /path/to/save
If you want to use a custom dataset, you can run:
python3 -m unsupervised_keypoints.main --dataset_loc /path/to/dataset --dataset_name custom
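The exact layout expected for a custom dataset is not spelled out here; assuming it is a flat folder of RGB images, a minimal image-folder loader in the spirit of what the custom option would consume might look like the sketch below. The class name, file extensions, and resolution are illustrative assumptions, not the repository's code.

import glob, os
from PIL import Image
from torch.utils.data import Dataset
from torchvision import transforms

class CustomImageFolder(Dataset):
    """Hypothetical loader: a flat folder of RGB images, resized to a fixed resolution."""
    def __init__(self, root, image_size=512):
        # Collect common image files from the dataset directory (assumed extensions).
        self.paths = sorted(glob.glob(os.path.join(root, "*.jpg")) +
                            glob.glob(os.path.join(root, "*.png")))
        self.transform = transforms.Compose([
            transforms.Resize((image_size, image_size)),
            transforms.ToTensor(),
        ])

    def __len__(self):
        return len(self.paths)

    def __getitem__(self, idx):
        return self.transform(Image.open(self.paths[idx]).convert("RGB"))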
Precomputed tokens
We provide the precomputed tokens here.
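The file format of the released tokens is not documented in this README; assuming they are saved with PyTorch, loading them might look like the following sketch (the filename and the expected shape are assumptions).

import torch

# Hypothetical filename; the contents are assumed to be a tensor of learned text embeddings.
tokens = torch.load("precomputed_tokens.pt", map_location="cpu")
print(type(tokens), getattr(tokens, "shape", None))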
BibTeX
@article{hedlin2023keypoints,
title={Unsupervised Keypoints from Pretrained Diffusion Models},
author={Hedlin, Eric and Sharma, Gopal and Mahajan, Shweta and He, Xingzhe and Isack, Hossam and Kar, Abhishek and Rhodin, Helge and Tagliasacchi, Andrea and Yi, Kwang Moo},
journal={arXiv preprint arXiv:2312.00065},
year={2023}
}