Training log prints epoch values of 0.5, 1.0, 1.5, 2.0
The training log prints the following:

    "log_history": [
      { "epoch": 0.5, "learning_rate": 0.0075, "loss": 8.3906, "step": 1 },
      { "epoch": 1.0, "learning_rate": 0.005, "loss": 5.7188, "step": 2 },
      { "epoch": 1.5, "learning_rate": 0.0025, "loss": 5.9648, "step": 3 },
      { "epoch": 2.0, "learning_rate": 0.0, "loss": 4.4141, "step": 4 },
      { "epoch": 2.0, "step": 4, "total_flos": 8666430308352.0, "train_loss": 6.1220703125, "train_runtime": 1372.5232, "train_samples_per_second": 0.003, "train_steps_per_second": 0.003 }
    ],
    "max_steps": 4,
    "num_train_epochs": 2,
    "total_flos": 8666430308352.0,
    "trial_name": null,
    "trial_params": null
The logged epoch values are 0.5, 1.0, 1.5, 2.0 and the final step is 4. Shouldn't epoch be an integer?
Environment:
- OS: Ubuntu 20.04
- Python: 3.8
- Transformers: 4.27.1
- PyTorch: 2.0
- CUDA Support: true
If logging is done per step, the current equivalent epoch is computed as n_step / len(dataloader), so fractional epoch values are expected; see the sketch below.
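For illustration, here is a minimal Python sketch of that computation. It is not the actual Trainer source; the dataset size, batch size, and gradient accumulation values are assumptions chosen to reproduce the 0.5-epoch increments seen in the log above.

```python
# Minimal sketch of how a per-step "epoch" value can be derived.
# num_samples, per_device_batch_size and gradient_accumulation_steps
# are assumed values, not taken from the original report.
import math

num_samples = 4                  # assumed size of the training set
per_device_batch_size = 2        # assumed batch size
gradient_accumulation_steps = 1  # assumed: no gradient accumulation
num_train_epochs = 2

# optimizer updates per epoch ~ len(dataloader) / gradient_accumulation_steps
steps_per_epoch = math.ceil(num_samples / per_device_batch_size) // gradient_accumulation_steps
max_steps = steps_per_epoch * num_train_epochs  # 4, matching "max_steps" in the log

for global_step in range(1, max_steps + 1):
    # equivalent epoch = n_step / steps_per_epoch -> 0.5, 1.0, 1.5, 2.0
    epoch = global_step / steps_per_epoch
    print(f"step={global_step}, epoch={epoch}")
```

With two optimizer steps per epoch, each logged step advances the epoch counter by 0.5, which is why the log shows 0.5, 1.0, 1.5, 2.0 rather than integers.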