E2FGVI

High resolution or long video length leads to CUDA out of memory

Open davidzang0930 opened this issue 1 year ago • 8 comments

I am using the e2fgvi_hq model.

  1. If the input resolution is increased from 432×240 to 648×360 or higher, I get a CUDA out of memory error.
  2. Fixing the resolution at 432×240 and increasing the input video length to 10 seconds or more also causes a CUDA out of memory error.

Is there any way to input a higher resolution or a longer video, or am I using it wrong?

davidzang0930 avatar May 05 '23 17:05 davidzang0930

Bro. The model is hungry for your GPU.

LownyCGI avatar Jun 02 '23 18:06 LownyCGI

@davidzang0930 did you find an answer for this?

antithing avatar Jul 06 '23 14:07 antithing

> @davidzang0930 did you find an answer for this?

Guys, what are you trying to do? The model is expensive to run.

LownyCGI avatar Jul 06 '23 22:07 LownyCGI

I was wondering if there were any settings to adjust that would use less memory at the cost of inference time.

I found this repo:

https://github.com/Teravus/Chunk_E2FGVI

which allows for more frames at a time, so I am using that one for now.

antithing avatar Jul 07 '23 13:07 antithing
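The chunked approach in that repo can be sketched roughly like this: run inference on overlapping windows of frames so only one window is resident on the GPU at a time. This is an illustrative sketch, not the actual Chunk_E2FGVI code; `model`, `inpaint_in_chunks`, and the overlap handling are assumptions.

```python
import torch

def inpaint_in_chunks(model, frames, masks, chunk_size=20, overlap=4, device="cuda"):
    """Process `frames` (T, C, H, W) in overlapping windows so only
    `chunk_size` frames live on the GPU at once (hypothetical helper)."""
    results = []
    step = chunk_size - overlap
    t = frames.shape[0]
    with torch.no_grad():
        for start in range(0, t, step):
            end = min(start + chunk_size, t)
            # Move only the current window to the GPU.
            f = frames[start:end].to(device)
            m = masks[start:end].to(device)
            out = model(f, m).cpu()          # bring results back to CPU right away
            # Drop frames already emitted by the previous overlapping chunk.
            keep_from = overlap if start > 0 else 0
            results.append(out[keep_from:])
            del f, m, out
            torch.cuda.empty_cache()         # release cached GPU blocks
            if end == t:
                break
    return torch.cat(results, dim=0)
```

The overlap is there because video-inpainting models use temporal context; frames at a window edge see fewer neighbors, so discarding the overlapped prefix of each later window avoids using those weaker predictions.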

@antithing I use the .half() function to allow it to use more frames, and I put selected_imgs and selected_masks into the GPU after selecting them. This way, it can process longer videos.

davidzang0930 avatar Jul 08 '23 05:07 davidzang0930

@davidzang0930 thanks! Are you able to share your code changes here?

antithing avatar Jul 08 '23 19:07 antithing

@antithing I'm still adjusting it to see if it can be optimized further. However, the changes above already reduce the memory demand. You can modify test.py to apply them.

davidzang0930 avatar Jul 10 '23 09:07 davidzang0930

Hi @davidzang0930, did you find any other ways to optimize it? If so, can you please share?

smandava98 avatar Sep 04 '23 09:09 smandava98