Anthony Blaom, PhD


On dev, a learner can implement `transform(learner, X)`, as shorthand for `transform(fit(learner, X), X)` or `transform(fit(learner), X)` for "static models".
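The shorthand can be illustrated with a minimal, self-contained sketch. The toy learner `MeanCenterer` and its fitted type are hypothetical illustrations, not part of LearnAPI.jl; only the pattern `transform(learner, X) = transform(fit(learner, X), X)` reflects the shorthand described above.

```julia
# Hypothetical toy learner, for illustration only:
struct MeanCenterer end                  # the learner (hyperparameters would live here)

struct MeanCentererFitted                # the fitted model
    mean::Float64
end

# training: compute the statistic needed to transform
fit(learner::MeanCenterer, X) = MeanCentererFitted(sum(X)/length(X))

# transforming with a fitted model
transform(model::MeanCentererFitted, X) = X .- model.mean

# the shorthand: fit and transform in one call
transform(learner::MeanCenterer, X) = transform(fit(learner, X), X)

transform(MeanCenterer(), [1.0, 2.0, 3.0])  # == [-1.0, 0.0, 1.0]
```

For a "static" learner (one consuming no training data), the fallback would instead read `transform(learner, X) = transform(fit(learner), X)`.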

Related discussion: https://github.com/alan-turing-institute/MLJ.jl/issues/60

Thanks @jeremiedb for the thoughtful commentary. I suppose that in any kind of "update", we provide one or more of the following: 1. Changes to hyper-parameters (e.g., an increase in...

Okay, 1 as above doesn't allow for adding iterations with a new learning rate for the new iterations. Perhaps: **1 only**: For a single (or no) hyperparameter replacement, the update...

https://juliaai.github.io/LearnAPI.jl/dev/fit_update/#Updating-2

For an example of `update_observations`, see [here](https://github.com/JuliaAI/LearnTestAPI.jl/blob/fb6a52185050e66225e78ffc16d8fa0ed59be702/src/learners/incremental_algorithms.jl#L7). For an example of `update` with internally computed out-of-sample predictions, see [here](https://github.com/JuliaAI/LearnTestAPI.jl/blob/fb6a52185050e66225e78ffc16d8fa0ed59be702/src/learners/ensembling.jl#L185).

In the absence of further engagement on our suggestions, closing now, as version 1.0 is now live.

cc @abhro Maybe you can include this.

There's a second place it occurs: https://github.com/JuliaAI/MLJ.jl/blob/18e9c9ff1208623de170c6302c033ae45c551bc4/docs/src/mlj_cheatsheet.md?plain=1#L299