controlnetvideo
Improve performance
Hi, it takes a long time (32 min) to generate an 8-second video with the controlnets I am testing (lineart21, openpose21, canny21). @un1tz3r0, could multiprocessing be added when applying the filters/detectors to the frames?
Maybe multiple models are needed to do this?
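Something along these lines is what I have in mind. It is only a rough sketch, not the actual controlnetvideo code: it precomputes the control images for all frames with a process pool, using OpenCV's Canny purely as a stand-in for whichever preprocessor (openpose21/lineart21/canny21) is configured. The names run_detector and precompute_control_frames are hypothetical.

from concurrent.futures import ProcessPoolExecutor

import cv2
import numpy as np

def run_detector(frame: np.ndarray) -> np.ndarray:
    # stand-in detector: grayscale + Canny edges, imitating a canny-style preprocessor
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return cv2.Canny(gray, 100, 200)

def precompute_control_frames(video_path: str, workers: int = 4) -> list[np.ndarray]:
    # decode all frames up front, then run the detector over them in parallel
    cap = cv2.VideoCapture(video_path)
    frames = []
    ok, frame = cap.read()
    while ok:
        frames.append(frame)
        ok, frame = cap.read()
    cap.release()
    with ProcessPoolExecutor(max_workers=workers) as pool:
        # map preserves order, so control images stay aligned with the input frames
        return list(pool.map(run_detector, frames))

if __name__ == "__main__":
    control_frames = precompute_control_frames("controlnetvideo/data/video.mp4")
    print(f"precomputed {len(control_frames)} control frames")

That said, I suspect most of the 32 minutes is spent in the diffusion steps on the GPU rather than in the detectors, so parallelizing only the detector pass may give a limited speedup.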
My config (I swap --controlnet between lineart21, openpose21 and canny21):
!python3 controlnetvideo/controlnetvideo.py \
controlnetvideo/data/video.mp4 \
--controlnet openpose21 \
--prompt 'Male gladiator observing his fighting blade' \
--prompt-strength 9 \
--show-input \
--show-detector \
--show-motion \
--dump-frames '{instem}_frames/{n:08d}.png' \
--init-image-strength 0.4 \
--color-amount 0.3 \
--feedthrough-strength 0.001 \
--show-output \
--num-inference-steps 15 \
--skip-dumped-frames \
'{instem}_out.mp4'
My video (217 frames): https://www.pexels.com/video/man-texting-on-the-street-855574/