returnn
PyTorch distributed: eval distributed as well
It's not really so difficult: split the dataset across the workers (that's the trickiest part), then let each worker run eval on its shard, and correctly accumulate the results across workers. A sketch of this follows below.
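
A minimal sketch of the idea, not RETURNN's actual implementation: the dataset split is done here with PyTorch's DistributedSampler, each rank evaluates its own shard, and the partial loss sums and sample counts are accumulated with all_reduce. The names model, eval_dataset, loss_fn and device are placeholders you would supply yourself.

    import torch
    import torch.distributed as dist
    from torch.utils.data import DataLoader, DistributedSampler


    def distributed_eval(model, eval_dataset, loss_fn, device):
        # DistributedSampler partitions the dataset so that each rank sees a
        # (nearly) disjoint shard. Note: without drop_last=True it pads the
        # last shard by repeating samples, which can slightly bias the result.
        sampler = DistributedSampler(eval_dataset, shuffle=False)
        loader = DataLoader(eval_dataset, batch_size=32, sampler=sampler)

        model.eval()
        total_loss = torch.zeros((), device=device)
        total_count = torch.zeros((), device=device)
        with torch.no_grad():
            for inputs, targets in loader:
                inputs, targets = inputs.to(device), targets.to(device)
                loss = loss_fn(model(inputs), targets)  # assumed mean-reduced
                total_loss += loss * inputs.shape[0]
                total_count += inputs.shape[0]

        # Accumulate the partial sums over all workers, then normalize,
        # so every rank ends up with the same global average loss.
        dist.all_reduce(total_loss, op=dist.ReduceOp.SUM)
        dist.all_reduce(total_count, op=dist.ReduceOp.SUM)
        return (total_loss / total_count).item()

This assumes the process group is already initialized (e.g. via torch.distributed.init_process_group), as it would be during distributed training.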