Minor UI changes for the release
- [x] On the training page, add hover explanations for "round", "batch" and "epoch" and how they relate to each other.
- [x] Display a banner saying that Disco is currently in a demo state
- [x] Add a link to a feedback form
- [x] Add user feedback when downloading a model
- [ ] Add LLM training ETA
- [x] Clarify the use of "epochs" for the language modeling task (which aren't really epochs)
- [x] Display a message that LLM inference is currently only available via the CLI
Taken from a comment thread on the prerelease PR:
The epoch count is reset whenever the round is incremented, is that expected? When I train wikitext I get the following sequence:

round 0 - epoch 0 - batch 0
round 0 - epoch 0 - batch 1
[...]
round 0 - epoch 0 - batch 5
round 0 - epoch 1 - batch 0
[...]
round 0 - epoch 1 - batch 5
round 0 - epoch 2 - batch 0
round 1 - epoch 0 - batch 0
round 1 - epoch 0 - batch 1
Shouldn't the epoch count keep incrementing rather than being reset? (Clearly the UI needs more explanation of rounds, epochs and batches, see #691.)
I was thinking of roughly a descending relation, translated into words as "each round runs X epochs; each epoch runs Y batches", so every round/epoch resets the counters of the layers below it.
Hm, as a user I would like to know how many epochs the model has been trained for so far (without having to do a multiplication and remember how rounds translate into epochs). Is it even useful to show the round number at all? Could we instead only indicate (e.g. with an animation) when communication is happening, without necessarily showing the round number?