llama-lora-fine-tuning

RuntimeError: CUDA error: out of memory

Open Hzzhang-nlp opened this issue 1 year ago • 4 comments

I have two 3060 graphics cards with a total of 24 GB of memory. Why is this error still reported?
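A minimal diagnostic sketch, assuming a standard PyTorch/transformers setup rather than this repo's exact scripts: two 3060s give 2 × 12 GB, not a single 24 GB pool, so unless the model is sharded across both devices the process can still run out of memory on one 12 GB card. The snippet below prints per-device memory and shows one common way to split the base model across both GPUs; the model name and 8-bit flag are illustrative assumptions, not this project's configuration.

```python
import torch
from transformers import AutoModelForCausalLM

# Two 3060s are 2 x 12 GB devices, not one 24 GB device: check each card.
for i in range(torch.cuda.device_count()):
    props = torch.cuda.get_device_properties(i)
    print(f"cuda:{i} ({props.name}): "
          f"{torch.cuda.memory_allocated(i) / 1024**3:.1f} GB allocated / "
          f"{props.total_memory / 1024**3:.1f} GB total")

# Hypothetical example: shard the base model across both GPUs instead of
# loading it onto a single 12 GB card. The model id and load_in_8bit flag
# are assumptions for illustration only.
model = AutoModelForCausalLM.from_pretrained(
    "decapoda-research/llama-7b-hf",
    load_in_8bit=True,   # needs bitsandbytes; roughly halves weight memory vs fp16
    device_map="auto",   # needs accelerate; spreads layers across cuda:0 and cuda:1
)
```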

Hzzhang-nlp · Jun 27 '23 07:06