zero_nlp
Help!! How do I set the number of epochs for ChatGlm-v2-6b_Lora??
export CUDA_VISIBLE_DEVICES=0
python main.py \
    --do_train \
    --train_file D:/LLM/yuanzhoulvpi/AdvertiseGen/train.json \
    --validation_file D:/LLM/yuanzhoulvpi/AdvertiseGen/dev.json \
    --preprocessing_num_workers 10 \
    --prompt_column content \
    --response_column summary \
    --overwrite_cache \
    --model_name_or_path chatglm2-6b_model \
    --output_dir output/adgen-chatglm2-6b-lora_version \
    --overwrite_output_dir \
    --max_source_length 64 \
    --max_target_length 128 \
    --per_device_train_batch_size 1 \
    --per_device_eval_batch_size 1 \
    --gradient_accumulation_steps 16 \
    --predict_with_generate \
    --max_steps 3000 \
    --logging_steps 10 \
    --save_steps 100 \
    --learning_rate 2e-5 \
    --lora_r 32 \
    --model_parallel_mode True
I don't see an epoch setting anywhere here. Does that mean LoRA can only be trained for a single pass?
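
For reference, a minimal sketch, assuming main.py parses the standard HuggingFace Seq2SeqTrainingArguments (the flags above suggest it does): the epoch count is normally set with --num_train_epochs, and a positive --max_steps overrides it, which is why the command above runs for exactly 3000 optimizer steps without ever mentioning epochs. LoRA itself is not limited to one pass over the data. An epoch-based variant might look like this (the value 3 is only an example):

export CUDA_VISIBLE_DEVICES=0
# Hypothetical variant: drop --max_steps (a positive value overrides epochs)
# and set --num_train_epochs instead; all other flags are unchanged.
python main.py \
    --do_train \
    --train_file D:/LLM/yuanzhoulvpi/AdvertiseGen/train.json \
    --validation_file D:/LLM/yuanzhoulvpi/AdvertiseGen/dev.json \
    --preprocessing_num_workers 10 \
    --prompt_column content \
    --response_column summary \
    --overwrite_cache \
    --model_name_or_path chatglm2-6b_model \
    --output_dir output/adgen-chatglm2-6b-lora_version \
    --overwrite_output_dir \
    --max_source_length 64 \
    --max_target_length 128 \
    --per_device_train_batch_size 1 \
    --per_device_eval_batch_size 1 \
    --gradient_accumulation_steps 16 \
    --predict_with_generate \
    --num_train_epochs 3 \
    --logging_steps 10 \
    --save_steps 100 \
    --learning_rate 2e-5 \
    --lora_r 32 \
    --model_parallel_mode True

As a sanity check on the original command: with --per_device_train_batch_size 1 and --gradient_accumulation_steps 16, 3000 steps consume roughly 3000 × 16 = 48,000 training examples, so whether that is more or less than one epoch depends on how many records train.json contains.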