
[plugin_event_accumulator.py:323] Found more than one graph event per run, or there was a metagraph containing a graph_def, as well as one or more graph events. Overwriting the graph with the newest event.

BrandonLiang opened this issue 4 years ago · 1 comment

Hi,

I use writer.add_scalar() in a similar fashion to your tutorial, logging training loss and accuracy for each iteration of each epoch. I am seeing this output on stdout when launching TensorBoard, and I wonder if you have seen it:
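For reference, a minimal sketch of the per-iteration logging pattern described above (the tags, loss values, and log directory are illustrative, not from the original; a stub writer stands in when torch is unavailable):

```python
# Sketch of logging one scalar per training iteration with SummaryWriter.
try:
    from torch.utils.tensorboard import SummaryWriter
except ImportError:  # stand-in so the sketch runs without torch installed
    class SummaryWriter:
        def __init__(self, log_dir=None):
            self.records = []
        def add_scalar(self, tag, value, global_step):
            self.records.append((tag, value, global_step))
        def close(self):
            pass

writer = SummaryWriter(log_dir="runs/demo")  # hypothetical log directory

num_epochs, iters_per_epoch = 2, 3
for epoch in range(num_epochs):
    for it in range(iters_per_epoch):
        # Unique global step per iteration, so points don't overwrite.
        global_step = epoch * iters_per_epoch + it
        loss = 1.0 / (global_step + 1)  # placeholder training loss
        writer.add_scalar("Loss/train", loss, global_step)
        writer.add_scalar("Accuracy/train", 1.0 - loss, global_step)

writer.close()
```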

[Screenshot: terminal output repeating the warning above, Nov 1 2020]

Frankly, this output doesn't affect the actual TensorBoard run, but I would prefer to suppress these warnings on the command line altogether (they appear every second, which floods my terminal).

I came across this post: https://stackoverflow.com/questions/45890560/tensorflow-found-more-than-one-graph-event-per-run. However, in my case there is only a single log file for TensorBoard, whereas multiple files were the main concern in that post.

Thanks,

BrandonLiang avatar Nov 01 '20 23:11 BrandonLiang

When you write logs with SummaryWriter, an event file ('events.out....') is generated in the log directory.

Did you remove or relocate the previous event file?

If not, please remove or relocate it and then run TensorBoard again.

Alternatively, you can make a subdirectory for each event file to draw one plot per directory.
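A quick way to check whether a run directory has accumulated more than one event file (a hypothetical helper, not from this thread; the directory name is illustrative):

```python
import glob
import os

# Hypothetical run directory; adjust to your own --logdir path.
run_dir = "runs/demo"

# TensorBoard event files share the 'events.out.' prefix; more than one
# in a single run directory triggers the overwriting warning.
event_files = sorted(glob.glob(os.path.join(run_dir, "events.out.*")))
if len(event_files) > 1:
    print(f"{run_dir} contains {len(event_files)} event files; "
          "remove or relocate the older ones to silence the warning.")
```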

For example,

scalar/ ----- test1/-----eventfile1
          |-- test2/-----eventfile2
          |-- test3/-----eventfile3
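The layout above can be set up by pointing one writer at each subdirectory; a minimal sketch using standard-library calls (the root and run names mirror the example tree, and the SummaryWriter line is an assumption about how you would wire it in):

```python
import os

# One subdirectory per run under a common log root, so TensorBoard
# plots each run as a separate curve instead of seeing multiple
# event files in a single run directory.
log_root = "scalar"
for run in ("test1", "test2", "test3"):
    run_dir = os.path.join(log_root, run)
    os.makedirs(run_dir, exist_ok=True)
    # e.g. writer = SummaryWriter(log_dir=run_dir)  # one writer per run

print(sorted(os.listdir(log_root)))  # → ['test1', 'test2', 'test3']
```

Then launch TensorBoard on the root (`tensorboard --logdir scalar`) and it will pick up each run directory separately.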

HyoungsungKim avatar Nov 02 '20 05:11 HyoungsungKim