tensorboard-aggregator

Urgent Help Aggregating

Ademord opened this issue 2 years ago

I am trying to use this aggregator, but I am getting the following error:

  File "aggregator.py", line 31, in extract
    assert len(set(all_keys)) == 1, "All runs need to have the same scalar keys. There are mismatches in {}".format(all_keys)
AssertionError: All runs need to have the same scalar keys. There are mismatches in []

My directory structure looks like this:

ls ../results/
inference.run63++100  inference.run64  run61  run62  run63  run63++025  run63++050  run63++075  run63++100  run64  run64-inference  run64-nospeed  run64+vision  run65  run65-pure  run66  run67  run68  run68++050  run68++050-inference

I am trying to aggregate all my runs (which have many variables) into a single CSV so I don't have to download each one manually and then aggregate it by hand. Picture example attached (image).
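For reference, the assertion failing with an empty list suggests no scalar keys were collected at all, so maybe the event files are not where the aggregator looks for them. Something along these lines should print the scalar tags each run actually contains (a rough sketch using TensorBoard's EventAccumulator; the ../results path is just the one from above):

from pathlib import Path
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

# sketch: print the scalar tags found in every run directory under ../results
for run_dir in sorted(Path("../results").iterdir()):
    if not run_dir.is_dir():
        continue
    acc = EventAccumulator(str(run_dir))
    acc.Reload()  # parse the event files in this directory
    print(run_dir.name, acc.Tags().get("scalars", []))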

Ademord avatar Dec 14 '21 21:12 Ademord

So I managed to put together a script that merges the manually downloaded CSVs into a single CSV, for anyone who needs something like this:

import glob

import pandas as pd
import matplotlib.pyplot as plt

# my helper module that parses run/metric names out of the CSV file names
from run_name_getter import get_run_name, get_metric_name

# all CSVs exported from TensorBoard for one metric live in this folder
files = glob.glob("movingForward/*.csv")
if len(files) == 0:
    raise RuntimeError("no CSV files found in movingForward/")

# the Step column is shared across runs, so take it from the first file
step_column = pd.read_csv(files[0], usecols=["Step"])

# read the Value column of every file and rename it to the run it came from
all_csvs = [pd.read_csv(p, usecols=["Value"]) for p in files]
for csv, path in zip(all_csvs, files):
    csv.columns = [get_run_name(path)]

metric_name = get_metric_name(files[0])

# one wide frame: Step plus one column per run
df = pd.concat([step_column] + all_csvs, axis=1)

df.plot(x="Step", alpha=0.5, title=metric_name)
plt.show()
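One caveat: pd.concat with axis=1 assumes every CSV has exactly the same Step values in the same order; if some runs were logged for fewer steps, the columns get misaligned. Merging on Step is more robust, something like this (a rough sketch, reusing the same get_run_name helper):

import glob
from functools import reduce

import pandas as pd

from run_name_getter import get_run_name

files = glob.glob("movingForward/*.csv")

# one Step/Value frame per file, with Value renamed to the run it came from
frames = [
    pd.read_csv(p, usecols=["Step", "Value"]).rename(columns={"Value": get_run_name(p)})
    for p in files
]

# outer-merge on Step so runs with different lengths just get NaNs
df = reduce(lambda a, b: pd.merge(a, b, on="Step", how="outer"), frames).sort_values("Step")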

The only downside now is that I still have to manually download each CSV for each run and each metric.
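That step could probably be skipped too by reading the scalars straight out of the event files with TensorBoard's EventAccumulator instead of going through the web UI. A minimal sketch (the ../results layout is the one from above, and the tag name is just a placeholder for the actual scalar tag):

from functools import reduce
from pathlib import Path

import pandas as pd
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

TAG = "movingForward"  # placeholder: the real scalar tag name as shown in TensorBoard

frames = []
for run_dir in sorted(Path("../results").iterdir()):
    if not run_dir.is_dir():
        continue
    acc = EventAccumulator(str(run_dir))
    acc.Reload()
    if TAG not in acc.Tags().get("scalars", []):
        continue  # this run never logged the tag
    events = acc.Scalars(TAG)  # ScalarEvent(wall_time, step, value) tuples
    frames.append(pd.DataFrame({
        "Step": [e.step for e in events],
        run_dir.name: [e.value for e in events],
    }))

# merge on Step and write one CSV per metric, no manual downloads needed
df = reduce(lambda a, b: pd.merge(a, b, on="Step", how="outer"), frames)
df.to_csv(f"{TAG}.csv", index=False)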

Ademord avatar Dec 18 '21 10:12 Ademord

I'm no longer actively working on this tool and am not sure if this issue is still relevant. Going to close it for now. We can reopen if it's still an issue for you.

Spenhouet avatar Aug 24 '23 17:08 Spenhouet