Vitus Benson

8 comments by Vitus Benson

> Don't use an in-place noise process with scalars. You need to use an out-of-place noise process with immutable `u0` objects.

Thank you, this solved the `fill!` issues, still another...
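A minimal sketch of what that advice looks like in practice, assuming a scalar SDE built with StochasticDiffEq.jl and DiffEqNoiseProcess.jl; the drift/diffusion functions and parameters here are placeholders, not the original model:

```julia
using StochasticDiffEq, DiffEqNoiseProcess

# Out-of-place drift and diffusion: they return values instead of mutating buffers.
f(u, p, t) = -p[1] * u
g(u, p, t) = p[2]

u0 = 1.0                      # immutable scalar state, so nothing tries to `fill!` it
W  = WienerProcess(0.0, 0.0)  # out-of-place noise process (`WienerProcess!` would be the in-place variant)

prob = SDEProblem(f, g, u0, (0.0, 1.0), [0.5, 0.1]; noise = W)
sol  = solve(prob, SOSRI())
```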

Updated the issue, I'd appreciate any help from more advanced Julia users :-)

Below is the output. I found a way to avoid the problem: if I set the length of the grid to an even number, e.g. `xg = yg = zg...
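For context, a sketch of that kind of grid setup with `basins_of_attraction` from DynamicalSystems.jl; the system, ranges and grid size are placeholders (chosen with an even `length`), not the original values:

```julia
using DynamicalSystems

ds = Systems.lorenz()                            # stand-in system, not the one from the issue
xg = yg = zg = range(-25.0, 25.0; length = 20)   # even number of grid points per axis
basins, attractors = basins_of_attraction((xg, yg, zg), ds)
```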

Thanks all, indeed it seems to be the solver `SimpleATsit5` that introduces the issue. For instance:

```julia
ds = ContinuousDynamicalSystem(f, [-1., -1., -1.], [0.], f_jac)
xg = yg = zg...
```
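If the default `SimpleATsit5` integrator is the culprit, one way to probe that is to pass a different solver; the sketch below assumes the `diffeq` named-tuple keyword of `basins_of_attraction` in that DynamicalSystems.jl version and uses a placeholder system and grid:

```julia
using DynamicalSystems, OrdinaryDiffEq

ds = Systems.lorenz()                            # placeholder system
xg = yg = zg = range(-25.0, 25.0; length = 20)

# Swap the default SimpleATsit5 integrator for Tsit5 from OrdinaryDiffEq
# and tighten the tolerance, to check whether the artefact disappears.
basins, attractors = basins_of_attraction((xg, yg, zg), ds;
                                           diffeq = (alg = Tsit5(), reltol = 1e-9))
```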

Actually I am not so sure about that. With my original code:

```
julia> attractors[19]
3-dimensional Dataset{Float64} with 1 points
 -1.0  -1.0  1.0

julia> basins[40,40,52]
-1

julia> tr = trajectory(ds,...
```

I did play around a little bit with the parameters, but the result never changed. `mx_chk_lost = 10000` gives the same results.

> The reason being that the logger connector [moves all intermediate results to CPU](https://github.com/Lightning-AI/pytorch-lightning/blob/master/src/lightning/pytorch/trainer/connectors/logger_connector/logger_connector.py#L259) on teardown. So on the second call to `trainer.validate`, the helper-state (e.g. [cumulated_batch_size](https://github.com/Lightning-AI/pytorch-lightning/blob/master/src/lightning/pytorch/trainer/connectors/logger_connector/result.py#L203)) of the cached...

Ahh cool! Thanks for the clarification :) I don't know if you use the PyTorch Lightning script or not, but there is a bug: https://github.com/openclimatefix/graph_weather/blob/3cc040aed704940a60a4e3b8c730bf61663a43f9/train/pl_graph_weather.py#L280 This makes the edge processor very...