Bump versions of Python, torch, and pytorch-lightning
I would like to extend compatibility to Python 3.11, torch 2.1, and pytorch-lightning 2.1.3.
One fix was easy to make: renaming the `training_epoch_end` hook to `on_train_epoch_end`, as required since Lightning 2.0.
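For reference, Lightning 2.0 removed `training_epoch_end` in favor of `on_train_epoch_end`, which no longer receives an `outputs` argument. A minimal sketch of the migrated hook, with the base class stubbed out so the snippet stands alone:

```python
# The base class is stubbed here so the snippet runs standalone;
# in real code it would be pytorch_lightning.LightningModule.
class LightningModule:
    pass

class MyModule(LightningModule):
    # Lightning 1.x (removed in 2.0):
    # def training_epoch_end(self, outputs): ...

    # Lightning 2.x replacement; it receives no `outputs` argument, so any
    # per-batch state must be accumulated manually during training steps.
    def on_train_epoch_end(self):
        self.epoch_ended = True

m = MyModule()
m.on_train_epoch_end()
```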
Another fix addresses dtype-mismatch errors in the parameters of `masked_scatter` and `scatter_add`, probably related to the open issues #81876 and #115821. For now my solution is not very elegant: I force the data to `float32` both times it is passed to these functions.
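As a sketch of the workaround (assuming the mismatch is between a `float32` destination and a `float64` source, as is typical for these errors), casting the source tensor before the call avoids the `RuntimeError`:

```python
import torch

dest = torch.zeros(3, dtype=torch.float32)
index = torch.tensor([0, 1, 2])
src = torch.ones(3, dtype=torch.float64)  # mismatched dtype coming in

# scatter_add requires dest and src to share a dtype, so force float32
# before the call; the same cast would apply before masked_scatter.
out = dest.scatter_add(0, index, src.to(torch.float32))
```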
Finally, this PR is still a draft because of another error I cannot track down:
```
RuntimeError: Early stopping conditioned on metric `frobenius_norm_change` which is not available. Pass in or modify your `EarlyStopping` callback to use any of the following: `inertia`
```
I cannot find any definition of `frobenius_norm_change` in the code, nor in PyTorch or PyTorch Lightning, so I don't know how to resolve the error.
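From what I understand, this error means the `monitor` key passed to the `EarlyStopping` callback never appears among the metrics the module logs via `self.log(...)`. A hypothetical mimic of that availability check (function and metric names invented for illustration):

```python
# Hypothetical mimic of EarlyStopping's check that its monitored metric
# was actually logged; real Lightning raises the RuntimeError above instead.
def monitor_is_available(monitor: str, logged_metrics: dict) -> bool:
    return monitor in logged_metrics

# Per the error message, the model apparently logs only `inertia`:
logged = {"inertia": 0.42}

print(monitor_is_available("frobenius_norm_change", logged))  # False
print(monitor_is_available("inertia", logged))  # True
```

So the fix is likely either logging a metric under the name `frobenius_norm_change`, or pointing the callback's `monitor` at `inertia`, whichever the original author intended.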