transformers_tasks
How much GPU memory does finetuning an LLM need? I've already dropped batch_size to 1, but 24 GB of VRAM is still not enough...
Error:

```
OutOfMemoryError: CUDA out of memory. Tried to allocate 128.00 MiB (GPU 0; 24.00 GiB total capacity; 23.13 GiB already allocated; 0 bytes free; 23.15 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
```
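For a rough sense of why batch size 1 alone doesn't help: with full finetuning under Adam, each parameter typically needs fp32 weights, gradients, and two optimizer states, roughly 16 bytes per parameter, before any activations. A minimal back-of-the-envelope sketch, assuming a 6B-parameter model (the thread does not say which model is used, so the size is an assumption):

```python
# Back-of-the-envelope estimate of full-finetuning memory under Adam.
# Assumption: a 6B-parameter model; the actual model here is not stated.
n_params = 6e9

bytes_weights = n_params * 4   # fp32 weights
bytes_grads   = n_params * 4   # fp32 gradients
bytes_adam    = n_params * 8   # Adam states: exp_avg + exp_avg_sq, fp32 each

total_gib = (bytes_weights + bytes_grads + bytes_adam) / 2**30
print(f"~{total_gib:.0f} GiB before activations")  # ~89 GiB, well above 24 GiB
```

Under these assumptions, the weights and optimizer states alone far exceed a 24 GB card even at batch size 1 and short sequences.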
Hi, you can try reducing the max_source_seq_len and max_target_seq_len values here.
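A minimal sketch of trying both remedies together: the max_split_size_mb setting that the OOM message itself suggests, plus smaller sequence lengths. The script name train.py and the exact flag spellings --max_source_seq_len / --max_target_seq_len are assumptions based on the parameter names mentioned above; adjust them to your actual training script.

```python
import os
import subprocess

# Mitigate allocator fragmentation, as the OOM message suggests.
# This must be set before the training process initializes CUDA.
env = dict(os.environ, PYTORCH_CUDA_ALLOC_CONF="max_split_size_mb:128")

# Hypothetical invocation with reduced sequence lengths.
subprocess.run(
    [
        "python", "train.py",
        "--max_source_seq_len", "128",  # smaller than the defaults
        "--max_target_seq_len", "128",
    ],
    env=env,
    check=True,
)
```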
I'm on an RTX 4090 with 24 GB of VRAM, and it still errors out even with the lengths set to 50.