
Out of memory

moro0v0 opened this issue 9 months ago · 3 comments

I am using a server with two 24GB RTX 3090 GPUs. When I run bash run_baselines_lora.sh, I get an error indicating insufficient GPU memory. How can the code be configured to use both GPUs? Or, if dual GPUs cannot be supported, what can I do to get it running on a single 3090?

moro0v0 · Feb 18 '25

Hi, thank you for your interest in our work!

Our experiments were conducted on an A100, so the LLM's dtype can be set to bfloat16. With this setting, the experiment needs approximately 23GB of GPU memory: the LLM itself takes about 14GB, the VRAM for LoRA gradients and optimizer states is negligible, and activations take about 8GB. However, the 3090 does not support bfloat16, so the memory requirement will be significantly higher than 23GB (just loading the LLM already requires about 24GB).
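For reference, a minimal sketch of what that bfloat16 setting looks like in transformers (the model id below is a placeholder for illustration, not necessarily the checkpoint used by run_baselines_lora.sh):

```python
import torch
from transformers import AutoModelForCausalLM

# Placeholder model id for illustration; substitute your own checkpoint.
model_id = "meta-llama/Llama-2-7b-hf"

# bfloat16 weights take 2 bytes per parameter instead of 4, roughly
# halving the weight footprint relative to a float32 load.
base_model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
)
```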

As for multi-GPU unlearning, you can try setting the device_map to auto when loading the model:

base_model = AutoModelForCausalLM.from_pretrained(model_args.model_id, device_map='auto')

We will support multi-GPU unlearning as soon as we can.
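In case it helps, here is a slightly fuller sketch of the dual-GPU setup with a peft-style LoRA adapter attached. The model id and LoRA hyperparameters are illustrative placeholders, not the values from run_baselines_lora.sh:

```python
import torch
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Placeholder; in the repo's code this comes from model_args.model_id.
model_id = "meta-llama/Llama-2-7b-hf"

# device_map='auto' lets accelerate shard the weights across all visible
# GPUs, so neither 24GB 3090 has to hold the full model on its own.
# float16 is used here as a stand-in half-precision dtype on cards where
# bfloat16 is unavailable.
base_model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Illustrative LoRA settings; only the adapter weights are trained,
# which keeps gradient and optimizer memory small.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()
```

Note that device_map='auto' sharding requires the accelerate package to be installed.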

tbozhong · Feb 19 '25

Are you Chinese? Could we add each other on WeChat to discuss this? My QQ email is [email protected].

moro0v0 · Feb 19 '25

I really do need help, brother!

moro0v0 · Feb 19 '25