Darley Barreto

Results 30 comments of Darley Barreto

I was trying to make a version of this to run in parallel, but I couldn't figure out an easy way without using message passing, `Mutex`, or `Arc`. I wish...

@paugier what do you think? Could this be merged as is?

@paugier I just added a parallel version for Rust. On my machine I got the following:

```
time ./nbabel-nbody/target/release/nbabel ../data/input2k 2
Running Parallel SIMD version
Final dE/E = -0.44307369910184735
real...
```

The instances of my dataset are multi-dimensional, each feature is a time series sampled at the same rate (at the same time). Given the rate of the sampling, I know...

Sorry, there were two errors; it should be `N x Time_idx x (N_features + N_masks)`, where `Time_idx` is the time index of size `Max_series_size`.
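A minimal sketch of that layout, assuming numpy and hypothetical sizes (the actual `N`, `Max_series_size`, and feature counts are from my dataset and not shown here): each instance's time-series features and their observation masks are concatenated along the last axis.

```python
import numpy as np

# Hypothetical sizes, for illustration only.
N, Max_series_size, N_features, N_masks = 4, 100, 3, 3

features = np.random.randn(N, Max_series_size, N_features)
masks = np.ones((N, Max_series_size, N_masks))  # 1 = observed, 0 = padded

# Concatenate along the last axis: N x Time_idx x (N_features + N_masks)
batch = np.concatenate([features, masks], axis=-1)
print(batch.shape)  # (4, 100, 6)
```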

Using `autograd.detect_anomaly()` around `loss.backward()` I see:

```python
RuntimeError: Function 'OdeintAdjointMethodBackward' returned nan values in its 2th output.
Segmentation fault (core dumped)
```

So it seems to be during the backward...
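For reference, a self-contained sketch of how `torch.autograd.detect_anomaly()` pinpoints the backward function that produced the `nan`s. The toy `NanBackward` op is hypothetical (not the author's model); its backward deliberately returns `nan` gradients so that anomaly mode raises the same kind of `RuntimeError` as above.

```python
import torch

class NanBackward(torch.autograd.Function):
    """Toy op whose backward deliberately produces nan gradients."""
    @staticmethod
    def forward(ctx, x):
        return x * 2.0

    @staticmethod
    def backward(ctx, grad_out):
        return grad_out * float("nan")

x = torch.ones(3, requires_grad=True)
caught = None
with torch.autograd.detect_anomaly():
    loss = NanBackward.apply(x).sum()
    try:
        loss.backward()
    except RuntimeError as err:
        # Anomaly mode names the offending backward node in the message.
        caught = err
print(caught)
```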

OK, thanks for clarifying that! If I remember correctly, I have to add some hooks in the backward of `OdeintAdjointMethod`, because the debugger can't reach inside it. I will add more info here...
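The hook idea can be sketched like this, assuming only plain PyTorch: `Tensor.register_hook` runs a callback on the gradient as it flows through a tensor, which is handy exactly when a debugger can't step into the autograd internals. The `check_grad` helper and the toy graph below are illustrative; here `0 * inf` in the chain rule produces the `nan`.

```python
import torch

flagged = []

def check_grad(name):
    """Return a hook that records which tensor received a nan gradient."""
    def hook(grad):
        if torch.isnan(grad).any():
            flagged.append(name)
        return grad
    return hook

x = torch.tensor([0.0, 1.0], requires_grad=True)
x.register_hook(check_grad("x"))

y = torch.sqrt(x)       # d/dx sqrt(x) = 0.5 / sqrt(x) -> inf at x = 0
z = (y * 0.0).sum()     # chain rule multiplies that inf by 0 -> nan
z.backward()
print(flagged)
```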

After a couple of days of debugging, I found where the `nan`s come from. Testing your [example](https://github.com/patrick-kidger/torchcde/blob/master/example/irregular_data.py) and adding this simple print in [here](https://github.com/rtqichen/torchdiffeq/blob/master/torchdiffeq/_impl/rk_common.py#L189-L196):

```python
def _advance(self, next_t):
    """Interpolate through the next...
```
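The same per-step check can be sketched outside torchdiffeq on a self-contained fixed-step RK4 integrator in plain numpy (this is not torchdiffeq's actual solver, just the same structure): after each step, test the state for non-finite values and report the first step that breaks.

```python
import numpy as np

def f(t, y):
    # Toy dynamics dy/dt = -y, with exact solution y(t) = y0 * exp(-t).
    return -y

def rk4_step(f, t, y, dt):
    """One classical Runge-Kutta 4 step."""
    k1 = f(t, y)
    k2 = f(t + dt / 2, y + dt * k1 / 2)
    k3 = f(t + dt / 2, y + dt * k2 / 2)
    k4 = f(t + dt, y + dt * k3)
    return y + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6

t, y, dt = 0.0, np.array([1.0]), 0.1
for i in range(20):
    y = rk4_step(f, t, y, dt)
    t += dt
    # Same idea as the print added inside torchdiffeq's _advance:
    # flag the first step whose state is no longer finite.
    if not np.isfinite(y).all():
        print(f"non-finite state at step {i}, t={t:.2f}")
        break
```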

You were right, I replaced all activations with `tanh`. I also added [here](https://github.com/rtqichen/torchdiffeq/blob/master/torchdiffeq/_impl/solvers.py#L24-L31):

```python
def integrate(self, t):
    solution = torch.empty(len(t), *self.y0.shape, dtype=self.y0.dtype, device=self.y0.device)
    solution[0] = self.y0
    t = t.to(self.dtype)
    self._before_integrate(t)...
```
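A minimal sketch of why `tanh` helps here: its output is bounded in (-1, 1) with smooth gradients, which keeps the learned vector field of a neural ODE from blowing up. The layer sizes below are illustrative, not the author's architecture.

```python
import torch
import torch.nn as nn

# Hypothetical vector field for a neural ODE; sizes are illustrative.
func = nn.Sequential(
    nn.Linear(4, 32),
    nn.Tanh(),          # bounded in (-1, 1), smooth gradients
    nn.Linear(32, 4),
    nn.Tanh(),
)

x = 1000.0 * torch.randn(8, 4)   # even extreme inputs...
out = func(x)
print(out.abs().max())           # ...yield bounded outputs
```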

> Are you normalising your input data? (You should be.)

I wasn't 😞. That did solve my problem! Thank you so much for your attention and help with this!
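For completeness, a sketch of the standard fix, assuming plain numpy arrays: standardize each feature to zero mean and unit variance using statistics computed on the training split only.

```python
import numpy as np

rng = np.random.default_rng(0)
train = rng.normal(loc=50.0, scale=10.0, size=(1000, 3))  # badly scaled features

mean = train.mean(axis=0)
std = train.std(axis=0)
normalized = (train - mean) / std   # zero mean, unit variance per feature

print(normalized.mean(axis=0).round(6), normalized.std(axis=0).round(6))
```

The same `mean` and `std` should then be reused to transform the validation and test data, so the model never sees statistics from outside the training split.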