Out of memory
Excuse me, I have a question: I'm using 4× 32 GB V100 GPUs in parallel, but the program only uses one of the cards, which causes an out-of-memory error. How can I run inference across multiple GPUs?
This is my command: CUDA_VISIBLE_DEVICES=0,1,2,3 torchrun --standalone --nproc_per_node 1 scripts/inference.py configs/opensora/inference/64x512x512.py --ckpt-path ./OpenSora-v1-HQ-16x512x512.pth
Same question: I have 2 NVIDIA 4090s and I set "CUDA_VISIBLE_DEVICES=0,1 torchrun ...", but it seems only the first 4090 is used, and I get an out-of-memory error for 16x512x512 (16x256x256 is fine). If there is any solution for running inference on multiple GPUs, please let us know. Thanks!
Change the --nproc_per_node argument: torchrun launches that many worker processes per node, so with --nproc_per_node 1 only one GPU is used regardless of how many are visible. Set it to the number of GPUs you want to use.
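As a minimal sketch, the command from the question above with --nproc_per_node raised to 4 (matching the four visible V100s) would look like this. This assumes Open-Sora's inference script distributes work across the ranks torchrun spawns; the paths and config names are copied from the question:

```shell
# Launch one worker process per visible GPU; --nproc_per_node must
# match the number of devices listed in CUDA_VISIBLE_DEVICES,
# otherwise only the first card is used.
CUDA_VISIBLE_DEVICES=0,1,2,3 torchrun --standalone --nproc_per_node 4 \
    scripts/inference.py configs/opensora/inference/64x512x512.py \
    --ckpt-path ./OpenSora-v1-HQ-16x512x512.pth
```

For the 2× 4090 case, the same change would be CUDA_VISIBLE_DEVICES=0,1 with --nproc_per_node 2.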
This issue is stale because it has been open for 7 days with no activity.
This issue was closed because it has been inactive for 7 days since being marked as stale.