
CUDA Out of Memory!

Open Ethantequila opened this issue 1 year ago • 2 comments

Hi, I tried to run the demo (BLIVA_Vicuna 7B) on my local machine (V100, 16 GB), and it fails with an OOM error:

torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 86.00 MiB (GPU 0; 15.77 GiB total capacity; 12.18 GiB already allocated; 54.88 MiB free; 12.38 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF

How much GPU memory is required at minimum? And is there any way to reduce GPU usage?

Thanks a lot!
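The error message itself points at one mitigation: setting `max_split_size_mb` through the `PYTORCH_CUDA_ALLOC_CONF` environment variable to reduce fragmentation when reserved memory far exceeds allocated memory. A minimal sketch (the value `128` is illustrative, not a recommendation from the BLIVA authors, and the variable must be set before PyTorch first touches CUDA):

```python
import os

# The caching allocator reads PYTORCH_CUDA_ALLOC_CONF when torch first
# initializes CUDA, so this must run before `import torch` (or be
# exported in the shell before launching the demo script).
# max_split_size_mb caps the block size the allocator will split;
# smaller values can reduce the fragmentation the error message warns about.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"
```

Note this only helps with fragmentation; it cannot make a model that genuinely needs more than 16 GB fit on a V100.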

Ethantequila avatar Nov 01 '23 06:11 Ethantequila

Thank you for your interest in our work. The short answer is that it needs 20 GB of GPU memory. A longer answer is here: https://github.com/mlpc-ucsd/BLIVA/issues/3

gordonhu608 avatar Nov 01 '23 06:11 gordonhu608

> Thank you for your interest in our work. Short answer is it needs 20G memory. Longer answer is here: #3

Thanks for your quick reply! I am wondering how much GPU memory is needed to run BLIVA_Vicuna 7B with INT8?

Ethantequila avatar Nov 01 '23 07:11 Ethantequila
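As a rough back-of-envelope estimate (an illustrative calculation, not an official figure from the BLIVA authors): INT8 quantization stores each LLM parameter in one byte instead of the two bytes fp16 uses, so the 7B language model's weights alone drop from roughly 14 GB to roughly 7 GB, before counting the vision encoder, Q-Former, activations, and CUDA allocator overhead:

```python
# Back-of-envelope weight-memory estimate for a 7B-parameter LLM.
# These are illustrative assumptions, not measurements of BLIVA itself.
params = 7e9                   # LLM parameter count
fp16_gb = params * 2 / 1e9     # 2 bytes per parameter in fp16
int8_gb = params * 1 / 1e9     # 1 byte per parameter in INT8

print(f"fp16 weights: ~{fp16_gb:.0f} GB, INT8 weights: ~{int8_gb:.0f} GB")
# → fp16 weights: ~14 GB, INT8 weights: ~7 GB
# The ViT encoder, Q-Former, activations, and allocator overhead add
# several more GB on top of the weight storage.
```

This is why 8-bit loading can bring a 7B model within reach of a 16 GB card, though whether it actually fits depends on the non-LLM components and the inference batch size.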