Jianzhu Guo
It's challenging : |
A temporary workaround is to use `change_video_fps` to convert the original video to an integer FPS, e.g., 30: https://github.com/KwaiVGI/LivePortrait/blob/7f755d93b6e120d5ab6bdc2f08d814b5b6935ad7/src/utils/video.py#L133-L135
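For context, the general idea of that helper can be sketched as re-encoding the video at a fixed frame rate with ffmpeg. The function and parameter names below are hypothetical (check the linked `src/utils/video.py` lines for the real implementation), and it assumes `ffmpeg` is on your PATH:

```python
import subprocess

def build_fps_command(input_path: str, output_path: str, fps: int = 30) -> list:
    """Build an ffmpeg command that re-encodes a video at a constant integer FPS.

    Hypothetical sketch; the actual helper lives in src/utils/video.py
    of the LivePortrait repo.
    """
    return [
        "ffmpeg", "-y",          # overwrite output without prompting
        "-i", input_path,        # source video (possibly with a fractional FPS)
        "-r", str(fps),          # force a constant output frame rate
        output_path,
    ]

def change_video_fps(input_path: str, output_path: str, fps: int = 30) -> None:
    # Run the re-encode; raises CalledProcessError if ffmpeg fails.
    subprocess.run(build_fps_command(input_path, output_path, fps), check=True)
```

Afterwards, feed the re-encoded video (now at an exact integer FPS) into the pipeline instead of the original.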
Not tested on Intel GPUs. If PyTorch supports Intel GPUs, I think it will work. For reference, here is a discussion on AMD GPUs: https://github.com/KwaiVGI/LivePortrait/discussions/482
More keypoints may be better, but it's hard to specify the best number of keypoints. Honestly, non-diffusion-based methods struggle to simulate hair swaying.
I think you should refer to the https://github.com/kijai/ComfyUI-LivePortraitKJ repo : ) One suggestion I can think of is to revert to a previous working version and then compare the diff.
It appears that we didn't provide a quantitative measure for it. You can refer to the [LivePortrait paper](https://arxiv.org/pdf/2407.03168) for more details.