
Failed to evaluate codet5p-2b on multiple GPU cards

Open · Cxm211 opened this issue 2 years ago · 1 comment

Hi, I tried to evaluate codet5p-2b. I loaded the model from Hugging Face and got a CUDA out of memory error, so I tried to load the model across multiple GPU cards by adding `device_map='auto'` when loading it. But then I got another error: `CodeT5pEncoderDecoderModel does not support device_map='auto' yet`. The same issue happens when I load my own fine-tuned codet5p-2b models.
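For reference, this is a minimal sketch of what I'm doing, assuming `transformers` with `accelerate` installed (the `build_load_kwargs` helper is just mine for illustration; the model card recommends `trust_remote_code=True` for this checkpoint):

```python
# Sketch: loading Salesforce/codet5p-2b sharded across available GPUs
# with device_map="auto" (sharded loading requires the `accelerate` package).
from typing import Any, Dict

CHECKPOINT = "Salesforce/codet5p-2b"

def build_load_kwargs(multi_gpu: bool = True) -> Dict[str, Any]:
    """Return keyword arguments for AutoModelForSeq2SeqLM.from_pretrained."""
    kwargs: Dict[str, Any] = {
        "trust_remote_code": True,  # codet5p-2b ships custom model code
    }
    if multi_gpu:
        # Let accelerate place the weight shards across available GPUs.
        kwargs["device_map"] = "auto"
    return kwargs

if __name__ == "__main__":
    # Heavy download / GPU work, so only run when executed directly.
    from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(CHECKPOINT)
    model = AutoModelForSeq2SeqLM.from_pretrained(
        CHECKPOINT, **build_load_kwargs(multi_gpu=True)
    )
```

With the model class as published at the time, the `from_pretrained` call above is where the `device_map='auto'` error is raised.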

Cxm211 avatar Jul 19 '23 08:07 Cxm211

Hi there, we've updated the model class and this issue should be fixed now.

yuewang-cuhk avatar Aug 04 '23 02:08 yuewang-cuhk