DiffPortrait3D
Official repository of "[CVPR'24 Highlight] DiffPortrait3D: Controllable Diffusion for Zero-Shot Portrait View Synthesis"
Yuming Gu1,2 · You Xie2 · Hongyi Xu2 · Guoxian Song2 · Yichun Shi2 · Di Chang1,2 · Jing Yang1 · Linjie Luo2
1University of Southern California 2ByteDance Inc.
News
- [2024.03.18] Released code.
- [2024.02.26] Congratulations to our team! Our paper has been accepted as a CVPR 2024 Highlight, see you in Seattle!
- [2023.12.28] Released the DiffPortrait3D paper and project page.
Getting Started
Download the pretrained checkpoint from Google Drive here, and place the weights as follows:
DiffPortrait3D
|----checkpoints
|----model_state-540000-001.th
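Before launching inference, it can save a failed run to confirm the weights are at the path the scripts expect. A minimal sketch, assuming the layout above; `checkpoint_ready` is a hypothetical helper, not part of the released code:

```python
from pathlib import Path

def checkpoint_ready(root: str = ".") -> bool:
    """Return True if the pretrained weights sit at the expected path.

    Hypothetical helper: mirrors the checkpoint layout shown above.
    """
    return (Path(root) / "checkpoints" / "model_state-540000-001.th").is_file()

if not checkpoint_ready():
    print("Download the checkpoint from Google Drive and place it under checkpoints/")
```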
Environment
Our environment uses Python 3.8.5 and CUDA 11.7; other compatible versions may also work.
conda env create -f diffportrait3D.yml
conda activate diffportrait3D
We tested our code on NVIDIA V100, A100, and A6000 GPUs.
Inference with small example_data:
bash script/CVPR_Inference/inference_sample.sh
Inference with your own in-the-wild data:
Due to company IP policy, we cannot release the 3D-aware noise model. We therefore encourage you to obtain the 3D-aware noise from another pretrained 3D GAN; models such as GOAE and Triplanenet can serve as good 3D-aware noise initializations. Please also refer to EG3D for generating aligned camera conditions.
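For the aligned camera condition, EG3D-style pipelines typically flatten a 4x4 cam2world pose and a normalized 3x3 intrinsics matrix into a 25-dim label. A minimal sketch of that convention; the default focal length 4.2647 and camera radius 2.7 are the values commonly used in EG3D's FFHQ setup, but treat them as assumptions and match your own alignment pipeline:

```python
import numpy as np

def build_camera_condition(cam2world: np.ndarray, focal: float = 4.2647,
                           cx: float = 0.5, cy: float = 0.5) -> np.ndarray:
    """Flatten a 4x4 cam2world pose plus normalized 3x3 intrinsics into
    the 25-dim camera label used by EG3D-style conditioning.

    Assumption: intrinsics are normalized by image size (principal point
    at 0.5, 0.5); defaults follow EG3D's FFHQ preprocessing.
    """
    assert cam2world.shape == (4, 4)
    intrinsics = np.array([[focal, 0.0, cx],
                           [0.0, focal, cy],
                           [0.0, 0.0, 1.0]], dtype=np.float32)
    # Row-major flatten: 16 extrinsic values followed by 9 intrinsic values.
    return np.concatenate([cam2world.reshape(-1),
                           intrinsics.reshape(-1)]).astype(np.float32)

# Example: identity orientation with the camera 2.7 units back on +z.
pose = np.eye(4, dtype=np.float32)
pose[2, 3] = 2.7
label = build_camera_condition(pose)
print(label.shape)  # (25,)
```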
Citing
If you find our work useful, please consider citing:
@misc{gu2023diffportrait3d,
title={DiffPortrait3D: Controllable Diffusion for Zero-Shot Portrait View Synthesis},
author={Yuming Gu and Hongyi Xu and You Xie and Guoxian Song and Yichun Shi and Di Chang and Jing Yang and Linjie Luo},
year={2023},
eprint={2312.13016},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
Development
This research reference implementation is treated as a one-time code drop, so we may be slow to accept external contributions through pull requests.
Acknowledgments
Our code builds on several excellent repositories. We appreciate the authors for making their code publicly available.
