ddpm-torch
Minor Fix - Final Batch Size Is Zero When the Total Number of FID Samples Is a Multiple of the Batch Size
Thank you so much for open sourcing this work! I think I may have found a very, very slight issue.
When using the `--eval-total-size` argument, the final batch in the evaluator will have size 0 if the value passed to `--eval-total-size` is a multiple of the batch size, i.e. when `self.eval_total_size % self.eval_batch_size == 0`. For example:
python train.py --eval --eval-total-size 512
This will cause a `ValueError` later in the FID calculation, since we compute the mean of an empty array.
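
For reference, here is a minimal sketch of one way the per-batch sizes could be computed so that a trailing empty batch is never produced. The variable names `eval_total_size` and `eval_batch_size` mirror the attributes mentioned above, but the helper itself is hypothetical and not part of the repo:

```python
def eval_batch_sizes(eval_total_size: int, eval_batch_size: int):
    """Return the size of each evaluation batch, skipping a trailing
    zero-sized batch when eval_total_size is a multiple of eval_batch_size."""
    n_full, remainder = divmod(eval_total_size, eval_batch_size)
    sizes = [eval_batch_size] * n_full
    if remainder:  # only append a final partial batch when it is non-empty
        sizes.append(remainder)
    return sizes

# 512 total samples with batch size 256 -> [256, 256], no empty final batch
assert eval_batch_sizes(512, 256) == [256, 256]
# 500 total samples with batch size 256 -> [256, 244]
assert eval_batch_sizes(500, 256) == [256, 244]
```

Guarding the final batch like this keeps the sample count at exactly `eval_total_size` while ensuring the FID statistics are never computed over an empty array.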