detector object out of memory
Hello,
I am using `detector.detect_video()` to process videos with 4500 frames (3 min duration). After the object finishes processing each frame, it performs some additional computations which are extremely memory intensive, requiring up to 82 GB.
Is it possible to turn the computation of these additional steps off? I just need the AUs.
Thank you
Martin
Or at least do them in a slow but memory-feasible way? My videos are about 20 minutes each; by that logic they'll require several TB at least.
How many videos do you have, and how long is the processing taking? Mine is 12 minutes, but the estimate is 3 hours. @Allexxann
Hundreds of recordings. The short ones are about 12-14 minutes, the long ones up to 30 (they are usually bundled together).
Though I use a modified algorithm: it adds extra features but quadruples the runtime, so I have to sample one frame per second (or even per two seconds) and analyze those. Even so, it took me a literal week of raw computing time to process a sample from a single experimental series.
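The per-second sampling mentioned above can be sketched as a small helper that converts the video's frame rate into a stride of frame indices. This is an illustrative sketch, not py-feat API; the function name and signature are my own.

```python
# Hypothetical sketch: given a video's fps and total frame count, pick one
# frame per `step_seconds` to analyze instead of every frame.

def sampled_frame_indices(total_frames: int, fps: float, step_seconds: float = 1.0) -> list[int]:
    """Return the frame indices to keep when sampling every `step_seconds`."""
    stride = max(1, round(fps * step_seconds))
    return list(range(0, total_frames, stride))

# A 3-minute clip at 25 fps (4500 frames), sampled once per second,
# leaves 180 frames to process instead of 4500.
indices = sampled_frame_indices(4500, 25.0)
# len(indices) == 180
```

Processing only the sampled frames reduces both runtime and peak memory roughly in proportion to the stride.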
Hello! Great work. By the way, does it get stuck during video detection? How did you solve the problem when detecting Action Units (AUs)? My 12-second video keeps getting stuck and can't proceed. Thank you.
What do you mean getting stuck? Can you perhaps supply more details?
Just like this: the progress bar keeps getting stuck here. A 12-second video could sit for 24 hours without moving. I have no idea what the reason is. This is my issue.
@mlt94 did you solve it?
I solved it by changing some code in the `detector.py` file:

- I comment out line 710, `batch_output.compute_identities(...)`, to skip those calculations.
- I pass `identity_model=None` when initializing the `Detector`; that way `identity_detector` is also `None`.
- I also comment out line 434, because it creates a 512-element torch tensor even when `identity_detector` is `None`.
- To prevent errors when creating the result CSV without identity data, I also comment out three more places:
  - lines 485-487: the `feat_identities` variable creation
  - line 506: the `feat_identities` Fex pd concatenation
  - line 692: the `identity_columns = FEAT_IDENTITY_COLUMNS[1:]` assignment

With these changes the output no longer contains the 512 identity columns.
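If your py-feat version accepts `identity_model=None` at construction time (as the steps above suggest), you may be able to get AU-only output without editing `detector.py`. Treat this as an untested sketch: the `skip_frames` argument and the `aus` accessor are assumptions that may differ between py-feat versions.

```python
from feat import Detector

# Sketch (untested): skip the memory-hungry identity embeddings by not
# loading an identity model at all. Whether `identity_model=None` is
# accepted depends on your py-feat version.
detector = Detector(identity_model=None)

# `skip_frames` (if supported by your version) analyzes every Nth frame,
# e.g. one frame per second for a 25 fps video.
fex = detector.detect_video("my_video.mp4", skip_frames=25)

# Keep only the Action Unit columns.
fex.aus.to_csv("aus_only.csv", index=False)
```

If construction fails with `identity_model=None`, falling back to the commented-out-lines approach above should give the same result.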
Thank you for this neat solution @pywugate! I didn't pursue it further myself, but I am happy to know of your solution should I come back to it.