William

Results 5 comments of William

I have the same issue as you, and the model being fine-tuned is Mixtral.

Just move all the data and the model to the same device (e.g. `cuda`).
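A minimal sketch of what that means in PyTorch, using a tiny stand-in `nn.Linear` model (the real fine-tuning model and batch are assumptions here):

```python
import torch
from torch import nn

# Tiny stand-in for the actual model being fine-tuned.
model = nn.Linear(4, 2)

# Pick one device for everything; fall back to CPU when no GPU is present.
device = "cuda" if torch.cuda.is_available() else "cpu"

# Move the model and every input tensor to that same device
# before the forward pass, so no cross-device mismatch occurs.
model = model.to(device)
batch = torch.randn(8, 4).to(device)

out = model(batch)
print(out.shape)  # torch.Size([8, 2])
```

The key point is that the model's parameters and the batch end up on the same device, which avoids the usual "expected all tensors to be on the same device" runtime error.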

> > Try turning off the command line proxy
>
> I have the same issue, but that did not work for me. I use an M1 Pro (2021), and...

@tjruwase @mrwyattii @loadams I tested fine-tuning llama2-70B in the same environment (hardware and software), and it was interrupted by the same error: `AttributeError: 'NoneType' object has no attribute 'swap_folder'`. Please...

It seems that the Mixtral model is being loaded in full onto every GPU rather than partitioned equally across your GPUs (A100).

```python
model = AutoModelForCausalLM.from_pretrained(
    model_name_or_model_path,
    torch_dtype=torch.float16,
)
```

Try this...