progrockdiffusion
[Idea] Make sure all the various random functions respect seed value
I think right now only PyTorch is using it, with maybe a few other instances sprinkled around. An effort to make sure all the various random functions are initialized with the seed value might help limit the differences when re-running an image with identical settings.
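For example, a rough sketch of the idea (purely illustrative; the function name and structure here are made up, not existing prd.py code) would be to feed the one seed value to every RNG source the pipeline touches:

```python
import os
import random

import numpy as np
import torch


def seed_everything(seed: int) -> None:
    """Illustrative sketch: seed every random source from a single value."""
    random.seed(seed)                      # Python's built-in RNG
    np.random.seed(seed)                   # NumPy
    torch.manual_seed(seed)                # PyTorch CPU (recent versions also seed CUDA)
    torch.cuda.manual_seed_all(seed)       # all CUDA devices, to be explicit
    os.environ["PYTHONHASHSEED"] = str(seed)
```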
I coincidentally took a stab at expanding on this yesterday when trying to do a prompt study. I wasn't able to lock it down entirely, because some of the PyTorch methods are non-deterministic no matter what, but basically it means taking this https://github.com/lowfuel/progrockdiffusion/blob/9bc4e7d444eb6a2fb0558e02b7832d0a03c63247/prd.py#L1488-L1493
and expanding it to:
```python
if seed is not None:
    np.random.seed(args.seed)
    random.seed(args.seed)
    torch.manual_seed(args.seed)
    torch.cuda.manual_seed(args.seed)
    torch.cuda.manual_seed_all(args.seed)
    torch.use_deterministic_algorithms(True, warn_only=True)
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False  # Questionable performance hit?
    os.environ['PYTHONHASHSEED'] = str(args.seed)
    os.environ["CUBLAS_WORKSPACE_CONFIG"] = ":16:8"  # Allow cuBLAS deterministic ops.
```
This covers a lot more bases (a little pedantically, but being certain is better than guessing). Without setting warn_only=True on the use_deterministic_algorithms call, PyTorch will throw an error when it starts executing certain operations; see torch.use_deterministic_algorithms.
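As a quick sanity check (just an illustrative snippet, not something in prd.py), re-seeding and sampling from each source twice should produce identical values once the above is in place:

```python
import random

import numpy as np
import torch


def draw(seed):
    # Re-seed every source, then sample once from each.
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    return random.random(), np.random.rand(), torch.randn(3)


a = draw(42)
b = draw(42)
assert a[0] == b[0]
assert a[1] == b[1]
assert torch.equal(a[2], b[2])
```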
Oh, this is great. I will try it out. Thanks!