
Error upscaling video - CPU used instead of GPU

Open catclaw opened this issue 9 months ago • 3 comments

Hi again!

I tried to upscale a 5-second video as a test, but both attempts failed. Instead of the GPU, both of my CPUs (dual Xeon Gold) were used for the upscale, and nothing happened. The load on both CPUs was ~85% (Python at 97%). For the first run, I set the GPU option to GPU 2 (my second GPU) and VRAM to 12 (24 GB VRAM available). For the second run, I set the GPU option to Auto and VRAM to 12, but again nothing happened, apart from both CPUs going from 41 °C to 89 °C even though I have water cooling! Here is the output in the VS Code terminal:

```
PS F:\QualityScaler-main\QualityScaler-main> & f:/QualityScaler-main/QualityScaler-main/.venv/Scripts/python.exe f:/QualityScaler-main/QualityScaler-main/QualityScaler.py
[QualityScaler] ffmpeg.exe found
[QualityScaler] Preference file does not exist, using default coded value

Uploaded files: 1 => Supported files: 1
==================================================
Starting upscale:
Files to upscale: 1
Output path: F:/AI bloopers/Upscaled
Selected AI model: BSRGANx4
Selected GPU: GPU 2
AI multithreading: 8
Blending factor: 0.7
Selected image output extension: .png
Selected video output extension: .mp4
Selected video output codec: h264_nvenc
Tiles resolution for selected GPU VRAM: 719x719px
Input resize factor: 50%
Output resize factor: 100%
Cpu number: 28
Save frames: True
==================================================
[QualityScaler] ffmpeg.exe found
[QualityScaler] Preference file does not exist, using default coded value
Loading AI model
Extracting video frames
Frames supported simultaneously by GPU: 8
Upscaling video
Upscaling video (8 threads)
F:\QualityScaler-main\QualityScaler-main\.venv\Lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py:115: UserWarning: Specified provider 'DmlExecutionProvider' is not in available provider names. Available providers: 'AzureExecutionProvider, CPUExecutionProvider'
  warnings.warn(
Stop
==================================================

Starting upscale:
Files to upscale: 1
Output path: F:/AI bloopers/Upscaled
Selected AI model: BSRGANx4
Selected GPU: Auto
AI multithreading: 4
Blending factor: 0.7
Selected image output extension: .png
Selected video output extension: .mp4
Selected video output codec: h264_nvenc
Tiles resolution for selected GPU VRAM: 719x719px
Input resize factor: 75%
Output resize factor: 100%
Cpu number: 28
Save frames: True
==================================================
[QualityScaler] ffmpeg.exe found
[QualityScaler] Preference file does not exist, using default coded value
Loading AI model
Extracting video frames
Frames supported simultaneously by GPU: 3
Upscaling video
Upscaling video (3 threads)
F:\QualityScaler-main\QualityScaler-main\.venv\Lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py:115: UserWarning: Specified provider 'DmlExecutionProvider' is not in available provider names. Available providers: 'AzureExecutionProvider, CPUExecutionProvider'
  warnings.warn(
Stop
```
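The key line in both runs is the `UserWarning`: ONNX Runtime could not find `DmlExecutionProvider` among the installed providers, so it silently fell back to `CPUExecutionProvider`, which is exactly why the CPUs were pegged instead of the GPUs. A minimal sketch of how such a fallback can be detected up front (the `pick_providers` helper is illustrative, not QualityScaler's actual code):

```python
def pick_providers(available, preferred="DmlExecutionProvider"):
    """Build a provider list for onnxruntime.InferenceSession.

    `available` is the list returned by onnxruntime.get_available_providers().
    If the preferred provider is missing, only CPU remains usable -- which
    is the silent fallback seen in the log above.
    """
    if preferred in available:
        return [preferred, "CPUExecutionProvider"]
    print(f"WARNING: {preferred} not installed, falling back to CPU")
    return ["CPUExecutionProvider"]

# With the providers reported in the log above, only CPU is usable:
print(pick_providers(["AzureExecutionProvider", "CPUExecutionProvider"]))
# -> ['CPUExecutionProvider']
```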

catclaw avatar Feb 14 '25 22:02 catclaw

Hi my friend,

your GPU does not have support for DmlExecutionProvider. Are you on an Azure server?

Djdefrag avatar Feb 17 '25 10:02 Djdefrag

> Hi my friend,
>
> your gpu does not have support for DMLExecutionProvider. Are you on an Azure server?

Hello, my friend! I hope you're doing well, and I'm so sorry for the very late reply - I'm studying journalism and I'm in the middle of mid-term exams...

No, I'm running Windows 11 Enterprise on a local workstation. I'm not sure I follow? This machine has dual RTX 3060 GPUs, and DmlExecutionProvider should work fine? (pip install onnxruntime-gpu) I also tried running QualityScaler on a second machine with an RTX 3090 and an A100, but the result is the same?

I tried this code:

```python
import onnxruntime as ort

print("ONNX Runtime version:", ort.__version__)
print("Available providers:", ort.get_available_providers())
```

And the output is:

```
ONNX Runtime version: 1.21.0
Available providers: ['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider']
```
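That provider list is the signature of the CUDA wheel (`onnxruntime-gpu`): DirectML support ships in a separate wheel, `onnxruntime-directml`, and only one ONNX Runtime build can be active in an environment at a time. A small hedged sketch mapping a provider list back to the wheel that likely produced it (the `guess_wheel` helper is hypothetical, not part of either project):

```python
def guess_wheel(providers):
    """Guess which onnxruntime package produced a get_available_providers() list."""
    if "DmlExecutionProvider" in providers:
        return "onnxruntime-directml"  # DirectML build (what QualityScaler expects)
    if {"CUDAExecutionProvider", "TensorrtExecutionProvider"} & set(providers):
        return "onnxruntime-gpu"       # CUDA/TensorRT build
    return "onnxruntime"               # CPU-only build

# The output above maps to the CUDA wheel, not the DirectML one:
print(guess_wheel(["TensorrtExecutionProvider", "CUDAExecutionProvider", "CPUExecutionProvider"]))
# -> onnxruntime-gpu
```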

catclaw avatar Mar 24 '25 18:03 catclaw

Hi my friend @catclaw,

I think you installed the wrong onnxruntime package. Install only the dependencies listed in requirements.txt :)

Djdefrag avatar Apr 17 '25 06:04 Djdefrag