Shorten duration of `test_gradients`
`test_gradients` takes a long time to run. I propose we look into shortening it, or switch over to using a modulus that doesn't implement a custom backward pass.
Would be nice. Maybe there are some settings there to tweak.
But yes, if we can use a stock modulus, that would be better… Maybe now that Torch supports complex numbers, that should be possible?
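A minimal sketch of the stock-modulus idea, assuming a Torch version with native complex autograd (1.8 or later): `torch.abs` on a complex tensor is differentiable out of the box, so no custom backward would be needed.

```python
import torch

# Assumption: torch >= 1.8, where complex autograd is available.
z = torch.randn(4, dtype=torch.complex128, requires_grad=True)
m = torch.abs(z)           # stock modulus, no custom backward
m.sum().backward()         # gradients flow through abs
print(z.grad is not None)  # → True
```

Note that the gradient of `abs` is still undefined exactly at zero, which is why the test inputs matter.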
@janden are we wanting to keep supporting torch 1.7? One thing we could do in that case to make the sqrt operation differentiable is adding a small epsilon.
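A sketch of the epsilon idea (the function name and `eps` value are illustrative, not from the codebase): smoothing the sqrt makes the modulus differentiable everywhere, including at the origin, at the cost of slightly perturbing the output values.

```python
import torch

def modulus_eps(x_real, x_imag, eps=1e-12):
    # Hypothetical helper: sqrt(re^2 + im^2 + eps) has a well-defined
    # gradient even when re = im = 0, unlike the plain modulus.
    return torch.sqrt(x_real**2 + x_imag**2 + eps)

x = torch.zeros(3, dtype=torch.float64, requires_grad=True)
y = torch.zeros(3, dtype=torch.float64)
modulus_eps(x, y).sum().backward()
print(torch.isfinite(x.grad).all())  # gradient is finite at zero
```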
> @janden are we wanting to keep supporting torch 1.7?
I'm not sure. That's a separate discussion.
> One thing we could do in that case to make the sqrt operation differentiable is adding a small epsilon.
That would break all the tests, so it'd be nice to avoid that…
Another option here is to test our modulus separately (using test_gradients) and only verify that we can run backward on the output of scattering (i.e., not test the gradient of the full transform).
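The split strategy above could be sketched as follows (the `transform` stand-in is a placeholder, not the actual scattering transform): a full `gradcheck` on the modulus alone, which is cheap, plus a backward smoke test on the end-to-end output.

```python
import torch
from torch.autograd import gradcheck

# Part 1: exact gradient check on the modulus only (double precision,
# small input, so this is fast).
z = torch.randn(8, dtype=torch.complex128, requires_grad=True)
print(gradcheck(torch.abs, (z,)))  # → True

# Part 2: smoke test — only verify that backward() runs on the output
# of the full transform, without checking its gradient numerically.
def transform(x):
    # Placeholder pipeline standing in for the scattering transform.
    return torch.abs(torch.fft.fft(x))

x = torch.randn(8, dtype=torch.complex128, requires_grad=True)
transform(x).sum().backward()
print(x.grad is not None)  # → True
```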
Given that we merged #873, I believe this discussion is no longer needed.
We no longer support the earlier versions of torch, and thus we no longer use the long version of test_gradients.