Flux.jl
Examples for `skip` and `stop`
`skip` claims that, if used in the callback passed to `train!`, the `update!` step will not happen. This isn't true; the callback is called too late for that. Its tests don't try to check that either; instead they test that, when called inside the loss function, it skips evaluating the rest of the loss.
So for now this PR adds an example showing what it actually does. Although perhaps it would be better to simply remove it? GitHub's search can't find a single use of it in the wild. Originally added in #1232.
Also adds an example for `stop`, which does work as described.
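For reference, a minimal sketch of how the two functions behave with the current `train!` loop (the model, data, and thresholds here are invented for illustration): `Flux.skip()` only takes effect when thrown inside the loss, since the callback runs after `update!`, while `Flux.stop()` halts training from the callback as documented.

```julia
using Flux
using Flux: train!

model = Dense(10, 1)
ps = Flux.params(model)
opt = Descent(0.1)
data = [(rand(Float32, 10), rand(Float32, 1)) for _ in 1:100]

function loss(x, y)
  l = Flux.Losses.mse(model(x), y)
  # Calling skip() here abandons this batch: no gradient, no update! for it.
  isnan(l) && Flux.skip()
  return l
end

# Calling stop() in the callback halts the whole training loop,
# but only after the current batch's update! has already run.
cb = () -> loss(first(data)...) < 1f-3 && Flux.stop()

train!(loss, ps, data, opt; cb = cb)
```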
Codecov Report
Merging #1913 (0f4da2b) into master (b6dbefb) will not change coverage. The diff coverage is n/a.
| Coverage Diff | master | #1913 | +/- |
|---|---|---|---|
| Coverage | 86.64% | 86.64% | |
| Files | 18 | 18 | |
| Lines | 1445 | 1445 | |
| Hits | 1252 | 1252 | |
| Misses | 193 | 193 | |
| Impacted Files | Coverage Δ | |
|---|---|---|
| src/optimise/train.jl | 88.57% <ø> (ø) | |
Continue to review the full report at Codecov.
Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update b6dbefb...0f4da2b.
I have definitely seen it in code posted to Discourse, so removal might be too disruptive. The "proper" design would be to allow users to specify where in the training loop each callback runs, but it's a slippery slope from there to something like FluxTraining.jl.
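Agreed that a hand-rolled loop is the escape hatch today. Purely as a hypothetical sketch (the `my_train!` name and `before_update` keyword are made up, not an existing API), letting the user choose where the hook runs might look like:

```julia
using Flux
using Flux.Optimise: update!

function my_train!(loss, ps, data, opt; before_update = () -> ())
  for d in data
    gs = gradient(() -> loss(d...), ps)
    before_update()   # runs before the parameters change, unlike train!'s cb
    update!(opt, ps, gs)
  end
end
```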