lyp

Results: 5 comments by lyp

prerequisite:

> `loss = self.criterion(logits, y)`

I noticed that you made a change from

> `self.log("train/loss", loss, on_step=False, on_epoch=True, prog_bar=False)`

to

> `self.log("train/loss", self.train_loss, on_step=False, on_epoch=True, prog_bar=True)`

In the ***original version***, since the...
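For context, here is a minimal toy module sketching the two logging styles side by side. This is not the template's actual module; in particular, I am assuming `self.train_loss` is a `torchmetrics.MeanMetric`, which is the usual pattern for the second style.

```python
import torch
from torch import nn
import pytorch_lightning as pl
from torchmetrics import MeanMetric


class ToyModule(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.net = nn.Linear(16, 2)
        self.criterion = nn.CrossEntropyLoss()
        self.train_loss = MeanMetric()  # assumption: metric object that averages the loss

    def training_step(self, batch, batch_idx):
        x, y = batch
        logits = self.net(x)
        loss = self.criterion(logits, y)

        # original version: log the raw tensor; Lightning reduces it over the epoch
        # self.log("train/loss", loss, on_step=False, on_epoch=True, prog_bar=False)

        # changed version: update the MeanMetric and log the metric object instead
        self.train_loss(loss)
        self.log("train/loss", self.train_loss, on_step=False, on_epoch=True, prog_bar=True)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)
```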

Thanks! Indeed, from the experimental results, they are the same. So, how about `reduce_fx`? What exactly does it do?
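For anyone else reading: my current understanding is that `reduce_fx` controls how the per-step values are reduced into the epoch-level value when `on_epoch=True`, with `"mean"` as the default. A rough sketch of passing it explicitly (the key names are illustrative, and I'm assuming the string forms `"mean"`/`"max"` are accepted in the Lightning version being used):

```python
def training_step(self, batch, batch_idx):
    x, y = batch
    loss = self.criterion(self.net(x), y)

    # explicit default: the epoch value is the mean of the per-step values
    self.log("train/loss", loss, on_step=False, on_epoch=True, reduce_fx="mean")

    # e.g. keep the largest batch loss seen during the epoch instead
    self.log("train/loss_max", loss, on_step=False, on_epoch=True, reduce_fx="max")
    return loss
```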

> This is the alignment model in the paper, which has three single-layer MLPs: v, W and U, but in the Tutorial 3 implementation we concatenate the vector _S_ and the encoder output h,...
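For what it's worth, a small sketch of why the two formulations line up, assuming the usual additive-attention score `v^T tanh(W s + U h)` (the dimensions and names below are illustrative, not taken from the tutorial):

```python
import torch
from torch import nn

dec_dim, enc_dim, attn_dim = 8, 6, 5
s = torch.randn(1, dec_dim)  # decoder state
h = torch.randn(1, enc_dim)  # one encoder output

# paper-style: separate projections W (for s) and U (for h), then v
W = nn.Linear(dec_dim, attn_dim, bias=False)
U = nn.Linear(enc_dim, attn_dim, bias=False)
v = nn.Linear(attn_dim, 1, bias=False)
score_paper = v(torch.tanh(W(s) + U(h)))

# tutorial-style: concatenate [s, h] and use a single linear layer
WU = nn.Linear(dec_dim + enc_dim, attn_dim, bias=False)
with torch.no_grad():
    # copy the same weights so the equivalence is visible
    WU.weight.copy_(torch.cat([W.weight, U.weight], dim=1))
score_cat = v(torch.tanh(WU(torch.cat([s, h], dim=1))))

print(torch.allclose(score_paper, score_cat))  # True: W s + U h == [W U] [s; h]
```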

Two files (`train.log` and `train_ddp_process_1.log`) and one folder (`.hydra`) are produced in the `ROOT_DIR`.

```yaml
lr_monitor:
  _target_: pytorch_lightning.callbacks.LearningRateMonitor
  logging_interval: null
  log_momentum: False
```
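As a sanity check, this Hydra-style config can be instantiated roughly as below. This is only a sketch: the inline `OmegaConf.create` dict is a stand-in for however the template actually loads its callback configs.

```python
import hydra
from omegaconf import OmegaConf

# hypothetical inline config mirroring the YAML above
cfg = OmegaConf.create(
    {
        "lr_monitor": {
            "_target_": "pytorch_lightning.callbacks.LearningRateMonitor",
            "logging_interval": None,
            "log_momentum": False,
        }
    }
)

# hydra.utils.instantiate builds the callback object from the _target_ path
lr_monitor = hydra.utils.instantiate(cfg.lr_monitor)
print(type(lr_monitor))  # pytorch_lightning.callbacks.lr_monitor.LearningRateMonitor
```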