CogVideo

'RuntimeError: CUDA out of memory.' when using an RTX 3080

Open · YeautyYE opened this issue · 1 comment

My GPU is an RTX 3080, but when I run the command sudo sh ./scripts/inference_cogvideo_pipeline.sh, the following error occurs:

RuntimeError: CUDA out of memory. Tried to allocate 54.00 MiB (GPU 0; 9.78 GiB total capacity; 9.53 GiB already allocated; 28.31 MiB free; 9.54 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation.  See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
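
The allocator hint at the end of the message refers to PyTorch's PYTORCH_CUDA_ALLOC_CONF environment variable. A minimal sketch of applying it, assuming the standard CUDA caching allocator; the value 128 here is only an illustrative choice, and the setting merely mitigates fragmentation rather than reducing how much memory the model actually needs:

# Must be set before CUDA is initialized (or exported in the shell
# before launching the inference script).
import os
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"

import torch
torch.zeros(1, device="cuda")  # first CUDA allocation picks up the setting

The same effect can be obtained by exporting PYTORCH_CUDA_ALLOC_CONF in the shell before running inference_cogvideo_pipeline.sh.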

YeautyYE · Mar 11 '23

Hi, it takes around 25 GB of GPU memory to run inference with batch size 1 (on our A100).
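
For anyone hitting the same error: a quick way to compare your card against that figure is to query its total memory through PyTorch. A minimal sketch (the ~25 GB number above is the maintainer's figure for batch size 1; an RTX 3080 reports roughly 10 GiB):

import torch

# Query the first CUDA device and report its total memory in GiB.
props = torch.cuda.get_device_properties(0)
print(f"{props.name}: {props.total_memory / 1024**3:.1f} GiB total")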

wenyihong · Apr 13 '23