pytorch-lightning
Log images with WandbLogger in processes other than rank zero
🚀 Feature
To log images with WandbLogger from processes other than rank zero.
Motivation
Images are usually not metrics that can be aggregated, so it should be possible to log different images from different processes and display all of them.
Pitch
Call WandbLogger.log_image
from different processes, and have all of the images logged.
Alternatives
LightningModule.log
and LightningModule.log_dict
already accept rank_zero_only
as an option; perhaps a similar mechanism could work here.
cc @tchaton @justusschock @awaelchli @borda @morganmcg1 @AyushExel @borisdayma @scottire @manangoel99
Hi @function2-llx! Engineer from W&B here. The current design of the loggers in pytorch lightning for multi-process training is such that the main wandb run is accessible only by the rank 0 process, while all other processes get a dummy object. This was done because all processes cannot share the same wandb run, which previously led to multiple empty runs being created.
A workaround for now would be to call self.all_gather
on your image variable, so that the rank 0 process receives the images from all ranks and can log them.
Hope this helps!
@rohitgr7 can this be closed?