
Logging SimCLR losses

Open erceguder opened this issue 3 years ago • 3 comments

Hi,

I couldn't find any related issues. How can I make MMSelfSup log the SimCLR (distributed) loss values?

Thanks in advance

erceguder avatar Jun 27 '22 12:06 erceguder

Did you mean the log file? Our log file in the work_dir records the loss value.

fangyixiao18 avatar Jun 27 '22 12:06 fangyixiao18

@fangyixiao18 Yes, the log file. It seems that neither the .log nor the .json file contains any loss information.

The .log file contents:

Name of parameter - Initialization information
backbone.encoder.0.0.convs.0.conv.weight - torch.Size([64, 3, 3, 3]):
KaimingInit: a=0, mode=fan_out, nonlinearity=relu, distribution=normal, bias=0
backbone.encoder.0.0.convs.0.bn.weight - torch.Size([64]):
The value is the same before and after calling init_weights of SimCLR
...
neck.bn0.weight - torch.Size([128]):
The value is the same before and after calling init_weights of SimCLR
neck.bn0.bias - torch.Size([128]):
The value is the same before and after calling init_weights of SimCLR

The .json file contents:

{"env_info": "sys.platform: linux\nPython: 3.8.13 | packaged by conda-forge | (default, Mar 25 2022, 06:04:18) [GCC 10.3.0]\nCUDA available: True\nGPU 0,1,2,3: NVIDIA A10G\nCUDA_HOME: /usr/local/cuda\nNVCC: Cuda compilation tools, release 11.0, V11.0.221\nGCC: gcc (GCC) 7.3.1 20180712 (Red Hat 7.3.1-13)\nPyTorch: 1.11.0\nPyTorch compiling details: PyTorch built with: ... log_level = 'CRITICAL'\nload_from = None\nresume_from = None\nworkflow = [('train', 1)]\npersistent_workers = True\nopencv_num_threads = 0\nmp_start_method = 'fork'\nwork_dir = './work_dirs/selfsup/histoSimCLR_bs4/'\nauto_resume = False\ngpu_ids = range(0, 4)\n", "seed": 0, "exp_name": "config.py"}

erceguder avatar Jun 27 '22 13:06 erceguder
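(Editor's note: the `log_level = 'CRITICAL'` line visible in the .json dump above is a likely culprit, since mmcv-based runners emit per-iteration loss lines at INFO level. A minimal sketch of the logging-related fields of an mmcv-style config, with an illustrative `interval` value:)

```python
# Logging-related fields in an mmcv-style config (sketch).
# With log_level = 'CRITICAL', INFO-level messages -- which carry the
# per-iteration loss values -- are suppressed from the .log file.
log_level = 'INFO'  # INFO (the usual default) lets loss lines through

log_config = dict(
    interval=50,  # illustrative: write a log entry every 50 iterations
    hooks=[
        dict(type='TextLoggerHook'),  # the hook that writes .log/.json lines
    ],
)
```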

Is your training process still running, or is it stuck after the log you provided?

fangyixiao18 avatar Jun 28 '22 02:06 fangyixiao18

Closing due to inactivity; please reopen if there are any further problems.

fangyixiao18 avatar Oct 17 '22 02:10 fangyixiao18