
fsdp_auto_wrap_policy: TRANSFORMER_BASED_WRAP is not working with the Trainer

Open eljandoubi opened this issue 1 year ago • 5 comments

System Info

(environment details were attached as a screenshot)

acc_cfg.yml:

```yaml
compute_environment: LOCAL_MACHINE
debug: false
distributed_type: FSDP
downcast_bf16: 'no'
enable_cpu_affinity: true
fsdp_config:
  fsdp_activation_checkpointing: true
  fsdp_auto_wrap_policy: TRANSFORMER_BASED_WRAP
  fsdp_backward_prefetch: NO_PREFETCH
  fsdp_cpu_ram_efficient_loading: true
  fsdp_forward_prefetch: true
  fsdp_offload_params: true
  fsdp_sharding_strategy: FULL_SHARD
  fsdp_state_dict_type: SHARDED_STATE_DICT
  fsdp_sync_module_states: true
  fsdp_use_orig_params: true
machine_rank: 0
main_process_ip: 0.0.0.0
main_process_port: 0
main_training_function: main
mixed_precision: bf16
num_machines: 3
num_processes: 24
rdzv_backend: etcd-v2
same_network: false
tpu_env: []
tpu_use_cluster: false
tpu_use_sudo: false
use_cpu: false
```
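For context: TRANSFORMER_BASED_WRAP builds PyTorch's transformer_auto_wrap_policy from the classes named in `fsdp_transformer_layer_cls_to_wrap`; that key is absent above, in which case Accelerate typically falls back to the model's `_no_split_modules`. A rough sketch of the kind of policy that would have to be produced for the projector to be wrapped (including PaliGemmaMultiModalProjector is an assumption taken from the Expected behavior section below, not from the reporter's config):

```python
import functools

from torch.distributed.fsdp.wrap import transformer_auto_wrap_policy
from transformers.models.paligemma.modeling_paligemma import (
    PaliGemmaMultiModalProjector,
)

# Sketch of the policy TRANSFORMER_BASED_WRAP is expected to build.
# The class set below is assumed from the "Expected behavior" section.
auto_wrap_policy = functools.partial(
    transformer_auto_wrap_policy,
    transformer_layer_cls={PaliGemmaMultiModalProjector},
)
# Accelerate would pass a callable like this as `auto_wrap_policy=` when it
# wraps the model in torch.distributed.fsdp.FullyShardedDataParallel.
```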

Who can help?

No response

Information

  • [ ] The official example scripts
  • [ ] My own modified scripts

Tasks

  • [ ] An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • [ ] My own task or dataset (give details below)

Reproduction

```
accelerate launch --config_file acc_cfg.yml train.py $TRAINING_ARGS
```

train.py is any training script that trains with transformers.Trainer. $TRAINING_ARGS are the TrainingArguments plus some paths to the data. (attachment: fdsp_trans)
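For reference, a minimal skeleton of the kind of train.py the command above would launch; the checkpoint name and the dataset/collator placeholders are assumptions, not the reporter's actual script:

```python
# Skeleton of a Trainer-based train.py for the command above.
# The checkpoint is a placeholder; train_dataset/data_collator must be
# replaced with the reporter's actual data pipeline.
from transformers import (
    HfArgumentParser,
    PaliGemmaForConditionalGeneration,
    Trainer,
    TrainingArguments,
)

train_dataset = None   # placeholder: supply the actual dataset here
data_collator = None   # placeholder: supply the actual collator here


def main():
    # $TRAINING_ARGS from the command line are parsed into TrainingArguments.
    parser = HfArgumentParser(TrainingArguments)
    (training_args,) = parser.parse_args_into_dataclasses()

    model = PaliGemmaForConditionalGeneration.from_pretrained(
        "google/paligemma-3b-pt-224"  # placeholder checkpoint
    )

    trainer = Trainer(
        model=model,
        args=training_args,
        train_dataset=train_dataset,
        data_collator=data_collator,
    )
    trainer.train()


if __name__ == "__main__":
    main()
```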

Expected behavior

Train the PaliGemma model with FSDP and have PaliGemmaMultiModalProjector wrapped.
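One way to check whether the projector actually ends up as its own FSDP unit is a small helper like the sketch below, run against the wrapped model (for example Trainer's model_wrapped once training has started); this is an illustrative diagnostic, not part of the report:

```python
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP


def report_fsdp_units(model):
    """Print every module that FSDP wrapped as its own unit (diagnostic sketch)."""
    for name, module in model.named_modules():
        if isinstance(module, FSDP):
            print(name or "<root>", "->", type(module.module).__name__)
```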

eljandoubi · Oct 12 '24 20:10