Hardware barely being used?
Just downloaded it today to try and upscale some IRL videos, only to see that none of my hardware is being used. CPU <2% usage, GPU <1% usage. Something is clearly wrong.
Using real-esrgan plus, which I know is supposed to be much slower than the anime conversion models, but I don't think the program got the memo that it's allowed to use my PC.
What GPU do you have? Did you select the right GPU when you were creating the tasks?
Also, can you screenshot your task manager on the CPU and GPU pages?
PC specs (abridged): DxDiag.txt
Pic of program and task manager usage:
My 2080 was the default selection by the program, so it should be using it.
The GPU fans are spinning up as if the GPU were under load, but the program's processing speed is next to nothing, and task manager doesn't seem to register anything being processed either.
For me, my CPU peaks at around 50% but sits at 20-30% most of the time, and my GPU is only at 4%, so I don't think my GPU is being used efficiently. I don't know if it's the NCNN framework that isn't efficient with my RTX 4070.
I think there's a chance that your video's size has exceeded your VRAM limit, which would cause the memory to spill into your system RAM and make processing a lot slower. Check your VRAM usage.
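If you want a quick way to watch VRAM while a task is running, here's a minimal sketch using the nvidia-ml-py (pynvml) bindings; the device index and polling interval are assumptions, so adjust them for your setup:

```python
# Minimal VRAM/utilization poller (assumes `pip install nvidia-ml-py`).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumption: GPU 0 is the card video2x uses

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)         # values are in bytes
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)  # percentages
        print(f"VRAM {mem.used / 2**30:.1f}/{mem.total / 2**30:.1f} GiB | GPU {util.gpu}%")
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```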
VRAM usage and CPU usage are negligible across the board.
Tested a 19MB file and a 200+MB file, and results were identical.
The problem is that only ONE thread of the GPU is being used (GPU engine, core, thread, whatever the fuck, it's all the same). Poor little guy is doing ALL the work with no friends :C
Something else of note is that frame interpolation utilizes the CPU correctly (all threads processing), but the GPU has the same problem as the one above. Only GPU thread 11 is being used while the rest of the GPU is idle.
@smgamermat77 I can't reproduce this issue yet. Right now multi-threading hasn't been implemented yet in 6.x, but the utilization rate still shouldn't be this low. In Afterburner, can you see the GPU clock/power consumption going up?
Also, have you tried to see what happens with Real-CUGAN or libplacebo?
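If you don't have Afterburner installed, here's a rough equivalent check with the same pynvml bindings (again assuming the GPU is device index 0), printing the graphics clock and power draw NVML reports:

```python
# Rough stand-in for the Afterburner clock/power readout (assumes nvidia-ml-py).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumption: device 0

clock_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000  # NVML reports milliwatts

print(f"graphics clock: {clock_mhz} MHz, power draw: {power_w:.1f} W")
pynvml.nvmlShutdown()
```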
I think the ncnn model has a lower system load than the PyTorch model. On an Nvidia GPU, it seems to utilize the Compute_1 engine. When running video2x, Compute_1 usage is around 80–99%, as shown in the image below.
In my environment (RTX 3060), when testing video conversion via the CLI with a 1024x576 video:
realesrgan-plus runs at approximately 0.34 FPS
realesr-generalv3 runs at approximately 5.3 FPS
realesr-animevideov3 runs at approximately 9.5 FPS
It’s clear that realesrgan-plus runs extremely slowly. The low CPU usage is also due to the system waiting for the model to execute. Currently, version 6.4.0 does not support generalv3, but you can rename the model file to realesrgan-plus-x4 to test it. For non-anime real-life videos, realesr-generalv3 is recommended as it delivers better results than animevideov3.
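For reference, here's a rough sketch of that rename workaround. The folder and file names below are hypothetical (ncnn models normally ship as a .param/.bin pair), so check your actual video2x models directory for the real names:

```python
# Hypothetical sketch of the "rename generalv3 to realesrgan-plus-x4" workaround.
# The models directory and source file names are assumptions; adjust to your install.
import shutil
from pathlib import Path

models_dir = Path(r"C:\video2x\models\realesrgan")  # hypothetical path

for ext in (".param", ".bin"):  # copy both halves of the ncnn model pair
    src = models_dir / f"realesr-generalv3-x4{ext}"  # hypothetical source name
    dst = models_dir / f"realesrgan-plus-x4{ext}"
    shutil.copy2(src, dst)
    print(f"copied {src.name} -> {dst.name}")
```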
@avan06 your load looks normal. realesrgan-plus is a very "heavy" model, so yeah, the CPU is just waiting for the GPU to finish inferencing. Generalv3 is available on HEAD, not yet released.
@smgamermat77 Right now multi-threading hasn't been implemented yet in 6.x
Could've just said that and skipped the entire conversation. So this software isn't usable for anything aside from anime.
@smgamermat77 That's not quite the case. If your bottleneck is the GPU, like with realesrgan-plus, then having multi-threading or not wouldn't make much of a difference. Either way, you should still see some GPU utilization.
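To put rough numbers on that, here's a back-of-the-envelope throughput model; the per-frame times are made up (the GPU time is loosely scaled from the ~0.34 FPS figure above), but it shows why extra CPU threads don't help when the GPU is the bottleneck:

```python
# Illustrative only: with a GPU-bound stage, adding CPU threads barely moves overall FPS.
gpu_time_per_frame = 3.0   # seconds of GPU inference per frame (assumed, ~0.33 FPS)
cpu_time_per_frame = 0.05  # seconds of CPU-side work per frame (assumed)

for cpu_threads in (1, 4, 16):
    # CPU work can be spread across threads; a single GPU still serializes inference.
    bottleneck = max(gpu_time_per_frame, cpu_time_per_frame / cpu_threads)
    print(f"{cpu_threads:>2} CPU threads -> ~{1 / bottleneck:.2f} FPS")
```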