Max Marion
Also getting this issue. At first I was using `limit_train_batches` in my PyTorch Lightning Trainer, but I removed it and still ended up with this error. PTL runs two validation...
I solved this with `num_sanity_val_steps=0` in my PyTorch Lightning Trainer! Seems like my intuition above was correct: after turning the sanity check off, the val set doesn't need to be reset,...
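For anyone landing here later, a minimal self-contained sketch of where the flag goes (toy model and data, not the original code), assuming a standard `pl.Trainer` setup:

```python
# Sketch: pass num_sanity_val_steps=0 so Lightning skips the pre-fit
# 2-batch validation sanity check, and the val dataloader isn't consumed
# before training starts. Model and data below are toy placeholders.
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl


class ToyModule(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(8, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return nn.functional.mse_loss(self.layer(x), y)

    def validation_step(self, batch, batch_idx):
        x, y = batch
        self.log("val_loss", nn.functional.mse_loss(self.layer(x), y))

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=1e-2)


train_loader = DataLoader(
    TensorDataset(torch.randn(64, 8), torch.randn(64, 1)), batch_size=16
)
val_loader = DataLoader(
    TensorDataset(torch.randn(16, 8), torch.randn(16, 1)), batch_size=16
)

trainer = pl.Trainer(
    max_epochs=1,
    num_sanity_val_steps=0,  # disable the sanity pass over the val set
)
trainer.fit(ToyModule(), train_dataloaders=train_loader, val_dataloaders=val_loader)
```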
lgtm! I kinda hate checking in notebooks but I do think it's better than a script in this case.