Improving facial likeness
First, congratulations on this project and thank you for your excellent work!
In my sample generations (which I have attached) you can see that the source face is not preserved very well, i.e. the person in the video does not look like the person in the source image.
I'm wondering, are there any settings I can change to improve this, or any other tips on the best source image to use to preserve facial features?
Thanks again!
https://github.com/magic-research/magic-animate/assets/1686576/c41d7d81-d55c-4fef-a9d1-deef2ef84026
https://github.com/magic-research/magic-animate/assets/1686576/32faa0c8-d0f9-4b9e-8377-fdb130b7ee1c
https://github.com/magic-research/magic-animate/assets/1686576/c6af8f67-a810-433f-9a43-7048e230bd4a
https://github.com/magic-research/magic-animate/assets/1686576/645c169f-d4f4-4e0d-9640-1b37bed3c212
I second that. The face is completely distorted. Is there any parameter that could be adjusted to preserve the original face shape?
The clothes are changing as well; I think that's worth taking note of too.
My approach would be: first process the video through this software to generate the dancing video, then pass the result through a face-swapping model such as InsightFace, or a project like roop or FaceFusion. That should give better overall results.
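For anyone who wants to try this two-stage pipeline, here is a minimal sketch of the post-processing step using OpenCV and InsightFace's `inswapper` model. This is an assumption-heavy illustration, not part of MagicAnimate: the file names are placeholders, and the `inswapper_128.onnx` weights may need to be obtained separately depending on InsightFace's current distribution policy.

```python
def restore_identity(source_image_path, video_in_path, video_out_path):
    """Swap the face from a source image onto every frame of a video.

    Sketch of the MagicAnimate -> face-swap pipeline suggested above.
    All paths are hypothetical placeholders; requires `opencv-python`,
    `insightface`, and the inswapper_128.onnx weights.
    """
    # Heavy imports kept inside the function so the module loads
    # without the optional dependencies installed.
    import cv2
    import insightface
    from insightface.app import FaceAnalysis

    # Face detector/recognizer bundle and the swapper model.
    app = FaceAnalysis(name="buffalo_l")
    app.prepare(ctx_id=0, det_size=(640, 640))
    swapper = insightface.model_zoo.get_model("inswapper_128.onnx")

    # Identity to restore: first face found in the source image.
    source_img = cv2.imread(source_image_path)
    src_face = app.get(source_img)[0]

    cap = cv2.VideoCapture(video_in_path)
    fps = cap.get(cv2.CAP_PROP_FPS)
    width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
    height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
    out = cv2.VideoWriter(
        video_out_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, (width, height)
    )

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        faces = app.get(frame)
        if faces:
            # Paste the source identity over the first detected face.
            frame = swapper.get(frame, faces[0], src_face, paste_back=True)
        out.write(frame)

    cap.release()
    out.release()
```

Usage would be something like `restore_identity("source.png", "magicanimate_out.mp4", "restored.mp4")`. Note this only fixes facial likeness, not the clothing drift mentioned above.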
@Pawandeep-prog Thank you very much for your suggestions. I will look into them.
Still, if MagicAnimate could get just a bit better it might be sufficient for my purposes. :pray:
Yes, the results are not even close to the demo :/
https://github.com/magic-research/magic-animate/assets/4608210/4211ae64-a1b1-4a56-9b1b-e5ea297040bd
I also noticed the same issue and hope there might be a way to resolve it.