
Create web visualization tools

EricZeiberg opened this issue 9 years ago • 10 comments

This PR creates a visualization tool (similar to the one in NeuralTalk) that lets someone track the training of the model in the browser by visiting the monitor.html page. It works by writing various metrics to the data.txt and train.txt files in the web_utils folder via Lua's I/O. The JS on the monitor.html page then reads those two files every 3 seconds and updates various graphs and progress bars to show how the training loss is decreasing, along with the completion of the current epoch / iteration.
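The polling side of this could be sketched roughly as follows. This is a hypothetical illustration, not code from the PR: the exact line format of data.txt isn't shown in this thread, so assume each line is a comma-separated "iteration,loss" pair.

```javascript
// Hypothetical sketch of monitor.html's data loading, assuming each
// line of web_utils/data.txt is "iteration,loss".
function parseMetrics(text) {
  return text
    .trim()
    .split("\n")
    .filter((line) => line.length > 0)
    .map((line) => {
      const [iter, loss] = line.split(",");
      return { iter: Number(iter), loss: Number(loss) };
    });
}

// In the page this would run on a timer, something like:
// setInterval(() => fetch("web_utils/data.txt")
//   .then((r) => r.text())
//   .then((t) => redrawChart(parseMetrics(t))), 3000);
```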

Visualization tools are disabled by default and can be enabled by adding the -visualize flag when running train.lua.

Here's a link to a screenshot of the page: http://puu.sh/jycUj/254db9430c.png (The black background of the graph was removed after this screenshot was taken, but otherwise the picture is accurate.)

Also, there are a lot of commits in this PR, so it's best to view it in combined mode instead of viewing each commit separately.

Thanks, Eric

EricZeiberg avatar Aug 12 '15 00:08 EricZeiberg

So after some testing, the page tends to crash if left loaded for a period of time. Working on averaging out data points to minimize memory usage right now.

Edit: Actually, I'm just going to change it to record every other point, cutting the data in half.
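The "record every other point" idea is a simple form of downsampling. A minimal sketch (illustrative, not from the PR): keep the even-indexed samples so the series length is roughly halved while the overall shape of the loss curve is preserved.

```javascript
// Keep every other data point to halve memory usage in the chart.
function thinPoints(points) {
  return points.filter((_, i) => i % 2 === 0);
}
```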

EricZeiberg avatar Aug 12 '15 03:08 EricZeiberg

Thanks! This looks great and I'd be happy to merge something like this when the kinks are ironed out.

karpathy avatar Aug 12 '15 10:08 karpathy

Alright, after upgrading to CanvasJS, the graph can now handle hundreds of thousands of data points at once with minimal optimization. Feel free to merge; I'll let you know if I find any other bugs.

Oh, and here's an updated screenshot: http://puu.sh/jz8d9/89901ee9f6.png

EricZeiberg avatar Aug 12 '15 19:08 EricZeiberg

Is it easy to also include, e.g., validation loss? Looking at it in the context of training loss is usually very useful. We can expand on this in the future, I suppose. I will look through this PR in detail tomorrow; it's already near midnight here. Thanks!

karpathy avatar Aug 12 '15 22:08 karpathy

It's easy to include anything, really. Would you like just some text displaying the most recent validation loss value, or a graph?

EricZeiberg avatar Aug 12 '15 23:08 EricZeiberg

Nice work! Would it be possible to also add the following graphs to the monitoring page?

  • training/validation accuracy on one graph (to check for overfitting/underfitting)
  • grad/param norm

These graphs would be useful for tuning hyperparameters.

alexkruegger avatar Aug 15 '15 15:08 alexkruegger

Ok, I had a closer look at the code, and while I am on board with the general idea of including web-based visualization of the training progress, I am hesitant to merge this particular version of it.

Going down the path of writing arbitrary things to different lines of a text file doesn't usually lead anywhere nice. Fast hacks and conveniences usually lead to frustration down the line.

I think a clean solution would compile reports in lua, which would then be exported as JSON. The JSON would contain a summarized full history of the training progress (e.g. current options, train/val loss as a list, example samples from current model, etc.). The web interface would then read these files and draw them. The web interface should have little state by itself - e.g. refreshing the page should give approximately the same view.
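The JSON report described above might look something like this sketch. The field names here are purely illustrative assumptions, not from the actual codebase; the point is that the file carries the full summarized history, so the page can be stateless and a refresh reproduces the same view.

```javascript
// Hypothetical shape of the training report. Field names are
// illustrative assumptions, not from char-rnn.
function buildReport(opts, trainLoss, valLoss, samples) {
  return JSON.stringify({
    options: opts,         // current training options
    train_loss: trainLoss, // full loss history as a list
    val_loss: valLoss,
    samples: samples,      // example text sampled from the current model
  });
}
```

The web interface would then just fetch this one file and redraw everything from it, with no accumulated client-side state.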

We can leave this pull request open so that anyone who stumbles by this and wants to use it can merge in your code, but I'll leave it out of master.

karpathy avatar Aug 15 '15 17:08 karpathy

Yep, that's fine. I'll work on it when I get some free time.

EricZeiberg avatar Aug 15 '15 17:08 EricZeiberg

Just wanted to say I'm working on this now (in case someone else was thinking of starting on this).

whackashoe avatar Dec 15 '15 08:12 whackashoe

+1

jtoy avatar Jan 30 '16 21:01 jtoy