
The program only uses the CPU for calculations on mac computers

Open teethandnail opened this issue 1 year ago • 15 comments

[Screenshot 2024-08-28 17:07:30] How can I enable the GPU on my Mac?

teethandnail avatar Aug 28 '24 09:08 teethandnail

image

I installed the above libraries.

teethandnail avatar Aug 28 '24 09:08 teethandnail

I'm facing the same issue on my M3 Max device too. I followed the steps and successfully ran python run.py --execution-provider coreml, but the app only uses my CPU and is thus very laggy.

emiyalee1005 avatar Aug 28 '24 16:08 emiyalee1005

+1, is there a way to get better support on the M3 chip? It's very laggy on my M3 MacBook and I believe the GPU is not fully used.

DaoDaoNoCode avatar Aug 28 '24 16:08 DaoDaoNoCode

Same question I came here to ask. It's atrocious on Mac, with severe lag. Also, it always uses Continuity Camera instead of the built-in Mac camera, and I don't see a flag to disable that.

cchance27 avatar Aug 28 '24 17:08 cchance27

I'm facing the same issue on my M3 Max device too. I followed the steps and successfully ran python run.py --execution-provider coreml, but the app only uses my CPU and is thus very laggy.

Here is the monitor data: [image]

emiyalee1005 avatar Aug 28 '24 17:08 emiyalee1005

Same issue for me. M2 Max. [Screenshot 2024-08-28 at 9:34:51 PM]

cdrake118 avatar Aug 29 '24 01:08 cdrake118

Same question I came here to ask. It's atrocious on Mac, with severe lag. Also, it always uses Continuity Camera instead of the built-in Mac camera, and I don't see a flag to disable that.

I changed the default camera by editing the ui.py file: look for camera = cv2.VideoCapture(0) and change the number. 0 was my iPhone, 1 was my monitor, so I assume 2 will be my MacBook's built-in camera. Your numbering will vary.
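A minimal sketch of that probing approach (helper names like probe_cameras and pick_index are illustrative, not part of Deep-Live-Cam):

```python
# Probe the first few OpenCV camera indices so you can pick the built-in
# webcam instead of Continuity Camera. The index-to-device mapping varies
# per machine, so probe rather than hard-coding an index.

def pick_index(available, preferred):
    """Return the preferred index if it opened, else the first one that did."""
    if preferred in available:
        return preferred
    return available[0] if available else None

def probe_cameras(max_index=4):
    """Return the list of camera indices OpenCV can actually open."""
    import cv2  # opencv-python; imported here so pick_index works without it
    found = []
    for i in range(max_index):
        cap = cv2.VideoCapture(i)
        if cap.isOpened():
            found.append(i)
        cap.release()
    return found

# Usage on your machine (then edit ui.py with the index you pick):
#   available = probe_cameras()              # e.g. [0, 1, 2]
#   index = pick_index(available, preferred=2)
```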

budda avatar Sep 03 '24 01:09 budda

image I installed the above libraries.

How did you manage to get the onnxruntime-silicon package installed?! I just get the error:

ERROR: Could not find a version that satisfies the requirement onnxruntime-silicon (from versions: none)
ERROR: No matching distribution found for onnxruntime-silicon

budda avatar Sep 03 '24 01:09 budda

image I installed the above libraries.

How did you manage to get the onnxruntime-silicon package installed?! I just get the error:

ERROR: Could not find a version that satisfies the requirement onnxruntime-silicon (from versions: none)
ERROR: No matching distribution found for onnxruntime-silicon

Wrong Python version?

defertoexpertise avatar Sep 04 '24 17:09 defertoexpertise

Using a 32-bit ONNX model increases the inference speed to 15fps, primarily because onnxruntime doesn't yet fully support Apple's M-series chips.

  1. You can refer to this repository: Deep-Live-Cam.
  2. I am currently converting an ONNX model to a PyTorch model and utilizing MPS for acceleration, but I have encountered some issues during inference and am working to resolve them.

If you are interested, feel free to contact me via email: [email protected].

solstice-gao avatar Sep 05 '24 10:09 solstice-gao

image I installed the above libraries.

How did you manage to get the onnxruntime-silicon package installed?! I just get the error:

ERROR: Could not find a version that satisfies the requirement onnxruntime-silicon (from versions: none)
ERROR: No matching distribution found for onnxruntime-silicon

Wrong Python version?

Yeah, mine is newer than 3.10. I tried to downgrade Python, but that didn't seem to "work" for me. I'm new to the world of Python, though...

It looks like the package has been renamed to onnxruntime-arm64 now?
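For context, pip reports "from versions: none" when no published wheel matches your interpreter and platform. A stdlib-only way to see what pip is matching against (the wheel_env helper is purely illustrative):

```python
import platform
import sys

def wheel_env():
    """Return the (Python version, machine) pair pip matches wheels against."""
    return (f"{sys.version_info.major}.{sys.version_info.minor}",
            platform.machine())

# On an Apple-Silicon Mac running Python 3.10 this prints ('3.10', 'arm64');
# per this thread, onnxruntime-silicon wheels only matched some such
# combinations, so a newer interpreter triggers the "none" error above.
print(wheel_env())
```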

budda avatar Sep 05 '24 13:09 budda

image I installed the above libraries.

How did you manage to get the onnxruntime-silicon package installed?! I just get the error:

ERROR: Could not find a version that satisfies the requirement onnxruntime-silicon (from versions: none)
ERROR: No matching distribution found for onnxruntime-silicon

I installed Miniconda, used it to create a virtual environment with Python 3.10.10, and then executed the commands per the author's instructions.
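A sketch of those steps (the env name deep-live-cam is arbitrary, and this assumes you are inside the cloned repo with its requirements.txt):

```shell
# Create and activate an isolated Python 3.10.10 environment with conda
conda create -n deep-live-cam python=3.10.10 -y
conda activate deep-live-cam

# Install the project dependencies, then the Apple-Silicon onnxruntime build
pip install -r requirements.txt
pip uninstall -y onnxruntime
pip install onnxruntime-silicon
```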

teethandnail avatar Sep 05 '24 13:09 teethandnail

Perhaps a silly question... but are you on an Arm-based Mac or an older Intel Mac?

defertoexpertise avatar Sep 05 '24 13:09 defertoexpertise

Perhaps a silly question... but are you on an Arm-based Mac or an older Intel Mac?

yeah, M2 Max here.

budda avatar Sep 05 '24 17:09 budda

Using a 32-bit ONNX model increases the inference speed to 15fps, primarily because onnxruntime doesn't yet fully support Apple's M-series chips.

  1. You can refer to this repository: Deep-Live-Cam.
  2. I am currently converting an ONNX model to a PyTorch model and utilizing MPS for acceleration, but I have encountered some issues during inference and am working to resolve them.

If you are interested, feel free to contact me via email: [email protected].

Did you ever successfully finish your conversion @solstice-gao ?

ChristianWeyer avatar Sep 26 '24 16:09 ChristianWeyer