temporal_fusion_transformer_pytorch
Got an error while running trainer = pl.Trainer ?
when I run:
trainer = pl.Trainer(max_nb_epochs=tft.num_epochs,
                     gpus=1,
                     track_grad_norm=2,
                     gradient_clip_val=tft.max_gradient_norm,
                     early_stop_callback=early_stop_callback,
                     # train_percent_check=0.01,
                     # val_percent_check=0.01,
                     # test_percent_check=0.01,
                     overfit_pct=0.01,
                     # fast_dev_run=True,
                     profiler=True,
                     # print_nan_grads=True,
                     # distributed_backend='dp'
                     )
trainer.fit(tft)
in training_tft.ipynb, it raises the error below:

TypeError                                 Traceback (most recent call last)
TypeError: __init__() got an unexpected keyword argument 'max_nb_epochs'
I would appreciate it a lot if anyone can help with this bug.
Hello, this code uses an older version of PyTorch Lightning, which used the argument max_nb_epochs. I recommend you look at pytorch-forecasting; it took the temporal fusion transformer from this repo and has an updated implementation: https://github.com/jdb78/pytorch-forecasting/tree/master
@Xanyv use max_epochs instead. Also, after solving this you'll get an error on early_stop_callback; pass the early-stopping callback via callbacks instead.
max_nb_epochs --> max_epochs, early_stop_callback --> callbacks, profiler=True --> profiler="advanced"
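The renames above can be collected into a small translation helper so old notebook code can be updated mechanically. This is a minimal sketch, not part of the repo: the function name and the mapping dict are hypothetical, and only the renames mentioned in this thread are covered.

```python
# Hypothetical helper: map deprecated pl.Trainer keyword arguments
# (pre-1.0 pytorch-lightning) to their current names, per this thread.
DEPRECATED_TRAINER_KWARGS = {
    "max_nb_epochs": "max_epochs",        # renamed
    "early_stop_callback": "callbacks",   # early stopping is now passed via callbacks=[...]
}

def translate_trainer_kwargs(kwargs):
    """Return a copy of kwargs with deprecated names replaced."""
    out = {}
    for key, value in kwargs.items():
        new_key = DEPRECATED_TRAINER_KWARGS.get(key, key)
        # callbacks expects a list of callback objects, so wrap a single callback
        if key == "early_stop_callback" and not isinstance(value, list):
            value = [value]
        out[new_key] = value
    return out

old = {"max_nb_epochs": 100, "gpus": 1, "early_stop_callback": "es"}
print(translate_trainer_kwargs(old))
# prints {'max_epochs': 100, 'gpus': 1, 'callbacks': ['es']}
```

The translated dict can then be unpacked into the new API, e.g. `pl.Trainer(**translate_trainer_kwargs(old))`, assuming the remaining arguments still exist in the Lightning version you install.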