Anthony Blaom, PhD
This PR is above average in complexity. This means a review is particularly important but it's also going to be more work than usual. @pat-alt Do you have any interest...
@pat-alt Sorry to re-ping, but I'm not sure who else to ask for a review here. @tiemvanderdeure Would you consider reviewing? If possible, hoping for a merge in the next...
Let's see if @pat-alt is able to find some time.
Thanks @pat-alt for your review. Much appreciated. I've made a few tweaks in response to the comments.
@pat-alt How are we doing? Happy with the changes?
Thanks @pat-alt for your review. 🙏🏾
Thanks. Perhaps we can now resolve https://github.com/FluxML/MLJFlux.jl/issues/162! I'm happy to make Metalhead a dependency of MLJFlux. PR welcome.
Closed in favour of #205
Good catch @rdavis120, thanks. That [doc string](https://github.com/FluxML/MLJFlux.jl/blob/4fa632de257cfde9dc0b01693b1f4270a4ecc405/src/types.jl#L697C51-L697C51) indeed requires updating.
Thanks @MathNog for reporting. I've not tried to reproduce, but your analysis sounds reasonable. (Current tests do include changing batch size for some non-recurrent networks.) Each time `MLJModelInterface.fit` is called,...