
How to use this code without a mask?

FlotingDream opened this issue 2 years ago · 6 comments

How can this code be used for a task without masks, where only image pairs (before and after watermarking) are available as input and target?

The ground-truth watermark mask M is required in formula (1) of your paper.

Any suggestions for the case with no ground-truth watermark mask?

The exact watermark mask cannot be obtained, but the watermark style is almost the same across the image data.

Thanks!

FlotingDream · Jun 30 '22
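Since (before, after) pairs are available, one hedged way to approximate the missing ground-truth mask M is to threshold the per-pixel difference between the two images. This is only an illustrative sketch (the function name and threshold are assumptions, not part of the SLBR codebase):

```python
import numpy as np

def estimate_mask(clean: np.ndarray, watermarked: np.ndarray, thresh: int = 10) -> np.ndarray:
    """Approximate a watermark mask from a (clean, watermarked) uint8 image pair.

    A pixel is treated as watermark if its absolute difference in any
    channel exceeds `thresh`. Returns a binary HxW mask in {0, 1}.
    """
    # Cast to a signed type so the subtraction does not wrap around.
    diff = np.abs(watermarked.astype(np.int16) - clean.astype(np.int16))
    return (diff.max(axis=-1) > thresh).astype(np.uint8)
```

In practice the threshold may need tuning per dataset, and a morphological closing step can help fill small holes in the estimated mask.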

I built an approximate (not exact) watermark and used it to generate my own dataset, then fine-tuned from your pre-trained model for 5 epochs (training set: 40k images; validation set: 500 images, also synthetic rather than real). I then tested on real watermarked images (not generated). The result is better than the pre-trained model and the network seems to have learned something, but I want it to do better on the real images. Any suggestions for this situation?
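The synthetic-data step described above (pasting an approximate watermark onto clean images to obtain input/target/mask triplets) can be sketched as below. The template format, opacity, and random placement are assumptions for illustration, not the repo's actual data pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

def composite_watermark(clean: np.ndarray, template: np.ndarray, alpha: float = 0.5):
    """Blend `template` (h x w x 3 uint8, zero pixels = transparent) onto
    `clean` at a random position with opacity `alpha`.

    Returns (watermarked input, clean target, binary HxW mask) — the
    triplet a mask-supervised watermark-removal model trains on.
    """
    H, W, _ = clean.shape
    h, w, _ = template.shape
    y = rng.integers(0, H - h + 1)
    x = rng.integers(0, W - w + 1)

    tmask = template.max(axis=-1) > 0            # non-transparent template pixels
    mask = np.zeros((H, W), dtype=np.uint8)
    mask[y:y + h, x:x + w] = tmask.astype(np.uint8)

    out = clean.astype(np.float32)               # copy; `clean` stays untouched
    region = out[y:y + h, x:x + w]               # view into `out`
    region[tmask] = (1 - alpha) * region[tmask] + alpha * template.astype(np.float32)[tmask]
    return out.astype(np.uint8), clean, mask
```

Randomizing the position, scale, and opacity per image tends to make the synthetic distribution closer to real watermarks than a single fixed paste.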

Training uses the options from train.sh (batch size reduced to 6 to fit GPU memory). The training log is as follows. Epoch 71 gives the best PSNR, although later epochs reach a lower loss.

==================================== WaterMark Removal =============================================
==> Start Time                                        : Thu Jun 30 19:01:12 2022
==> USE GPU                                           : 0
==================================== Stable Parameters =============================================
==> workers                                           : 2(2)
==> lr                                                : 0.001(0.001)
==> dlr                                               : 0.001(0.001)
==> beta1                                             : 0.9(0.9)
==> beta2                                             : 0.999(0.999)
==> momentum                                          : 0(0)
==> weight_decay                                      : 0(0)
==> gamma                                             : 0.1(0.1)
==> flip                                              : 0(0)
==> lambda_primary                                    : 0.01(0.01)
==> lambda_mask                                       : 1(1)
==> sltype                                            : vggx(vggx)
==> alpha                                             : 0.5(0.5)
==> sigma_decay                                       : 0(0)
==> test_dir                                          : /PATH_TO_DATA_FOLDER/(/PATH_TO_DATA_FOLDER/)
==> data                                              : ()
==> finetune                                          : ()
==> evaluate                                          : 0(0)
==> data_augumentation                                : 0(0)
==> debug                                             : 0(0)
==> input_size                                        : 256(256)
==> freq                                              : -1(-1)
==> normalized_input                                  : 0(0)
==> res                                               : 0(0)
==> requires_grad                                     : 0(0)
==> gpu                                               : 1(1)
==> gpu_id                                            : 0(0)
==> crop_size                                         : 256(256)
==> no_flip                                           : 0(0)
==> gan_norm                                          : 0(0)
==> hl                                                : 0(0)
==> sim_metric                                        : cos(cos)
==> project_mode                                      : simple(simple)
==> bg_mode                                           : res_mask(res_mask)
==> k_refine                                          : 3(3)
==> k_skip_stage                                      : 3(3)
==================================== Changed Parameters =============================================
==> nets                                              : slbr(dhn)
==> models                                            : slbr(basic)
==> epochs                                            : 75(30)
==> start_epoch                                       : 70(0)
==> train_batch                                       : 6(64)
==> test_batch                                        : 1(6)
==> schedule                                          : 65(5,10)
==> lambda_l1                                         : 2.0(4)
==> lambda_style                                      : 0.25(0)
==> lambda_content                                    : 0.25(0)
==> lambda_iou                                        : 0.25(0)
==> dataset_dir                                       : ../input/(/PATH_TO_DATA_FOLDER/)
==> checkpoint                                        : ./checkpoint(checkpoint)
==> resume                                            : ./checkpoint/v1/model_best.pth.tar()
==> preprocess                                        : resize(resize_crop)
==> masked                                            : 1(0)
==> loss_type                                         : hybrid(l2)
==> dataset                                           : CLWD(clwd)
==> name                                              : v1(v2)
==> k_center                                          : 2(1)
==> mask_mode                                         : res(cat)
==> use_refine                                        : 1(0)
==================================== Start Init Model  ===============================================
==> creating model 
==> creating model [Finish]
Downloading: "https://download.pytorch.org/models/vgg16-397923af.pth" to /root/.cache/torch/hub/checkpoints/vgg16-397923af.pth
100%|████████████████████████████████████████| 528M/528M [00:34<00:00, 16.0MB/s]
==> Total params: 21.39M
==> Total devices: 1
==> Current Checkpoint: ./checkpoint/v1
=> loading checkpoint './checkpoint/v1/model_best.pth.tar'
=> loaded checkpoint './checkpoint/v1/model_best.pth.tar' (epoch 70)
============================ Initization Finish && Training Start =============================================

Epoch: 71 | LR: 0.00010000
(11/6667) Data: 0.00s | Batch: 0.857s | Total: 0:00:21 | ETA: 0:00:00 | loss L1: 0.3186 | loss Refine: 0.2481 | loss VGG: 0.4479 | loss Mask: 2.3315 | mask F1: 0.3712
(111/6667) Data: 0.00s | Batch: 0.855s | Total: 0:01:47 | ETA: 0:00:00 | loss L1: 0.1842 | loss Refine: 0.1291 | loss VGG: 0.2400 | loss Mask: 0.8078 | mask F1: 0.7521
(211/6667) Data: 0.00s | Batch: 0.855s | Total: 0:03:14 | ETA: 0:00:00 | loss L1: 0.1603 | loss Refine: 0.1136 | loss VGG: 0.2086 | loss Mask: 0.6088 | mask F1: 0.8115
(311/6667) Data: 0.00s | Batch: 0.855s | Total: 0:04:40 | ETA: 0:00:00 | loss L1: 0.1396 | loss Refine: 0.0983 | loss VGG: 0.1832 | loss Mask: 0.5172 | mask F1: 0.8389
(411/6667) Data: 0.00s | Batch: 0.854s | Total: 0:06:06 | ETA: 0:00:00 | loss L1: 0.1329 | loss Refine: 0.0930 | loss VGG: 0.1733 | loss Mask: 0.4656 | mask F1: 0.8542
(511/6667) Data: 0.00s | Batch: 0.854s | Total: 0:07:32 | ETA: 0:00:00 | loss L1: 0.1263 | loss Refine: 0.0879 | loss VGG: 0.1651 | loss Mask: 0.4284 | mask F1: 0.8655
(611/6667) Data: 0.00s | Batch: 0.855s | Total: 0:08:58 | ETA: 0:00:00 | loss L1: 0.1229 | loss Refine: 0.0853 | loss VGG: 0.1612 | loss Mask: 0.4040 | mask F1: 0.8734
(711/6667) Data: 0.00s | Batch: 0.854s | Total: 0:10:25 | ETA: 0:00:00 | loss L1: 0.1193 | loss Refine: 0.0828 | loss VGG: 0.1570 | loss Mask: 0.3850 | mask F1: 0.8796
(811/6667) Data: 0.00s | Batch: 0.862s | Total: 0:11:51 | ETA: 0:00:00 | loss L1: 0.1173 | loss Refine: 0.0812 | loss VGG: 0.1548 | loss Mask: 0.3693 | mask F1: 0.8847
(911/6667) Data: 0.00s | Batch: 0.855s | Total: 0:13:18 | ETA: 0:00:00 | loss L1: 0.1137 | loss Refine: 0.0786 | loss VGG: 0.1506 | loss Mask: 0.3550 | mask F1: 0.8894
(1011/6667) Data: 0.00s | Batch: 0.855s | Total: 0:14:44 | ETA: 0:00:00 | loss L1: 0.1122 | loss Refine: 0.0777 | loss VGG: 0.1484 | loss Mask: 0.3438 | mask F1: 0.8930
(1111/6667) Data: 0.00s | Batch: 0.855s | Total: 0:16:10 | ETA: 0:00:00 | loss L1: 0.1100 | loss Refine: 0.0762 | loss VGG: 0.1457 | loss Mask: 0.3342 | mask F1: 0.8961
(1211/6667) Data: 0.00s | Batch: 0.869s | Total: 0:17:36 | ETA: 0:00:00 | loss L1: 0.1077 | loss Refine: 0.0745 | loss VGG: 0.1431 | loss Mask: 0.3259 | mask F1: 0.8988
(1311/6667) Data: 0.00s | Batch: 0.864s | Total: 0:19:02 | ETA: 0:00:00 | loss L1: 0.1056 | loss Refine: 0.0729 | loss VGG: 0.1407 | loss Mask: 0.3185 | mask F1: 0.9013
(1411/6667) Data: 0.00s | Batch: 0.855s | Total: 0:20:28 | ETA: 0:00:00 | loss L1: 0.1040 | loss Refine: 0.0718 | loss VGG: 0.1389 | loss Mask: 0.3120 | mask F1: 0.9034
(1511/6667) Data: 0.00s | Batch: 0.854s | Total: 0:21:55 | ETA: 0:00:00 | loss L1: 0.1024 | loss Refine: 0.0707 | loss VGG: 0.1374 | loss Mask: 0.3061 | mask F1: 0.9055
(1611/6667) Data: 0.00s | Batch: 0.855s | Total: 0:23:21 | ETA: 0:00:00 | loss L1: 0.1011 | loss Refine: 0.0698 | loss VGG: 0.1360 | loss Mask: 0.3009 | mask F1: 0.9073
(1711/6667) Data: 0.00s | Batch: 0.855s | Total: 0:24:47 | ETA: 0:00:00 | loss L1: 0.0994 | loss Refine: 0.0686 | loss VGG: 0.1339 | loss Mask: 0.2959 | mask F1: 0.9090
(1811/6667) Data: 0.00s | Batch: 0.860s | Total: 0:26:13 | ETA: 0:00:00 | loss L1: 0.0981 | loss Refine: 0.0677 | loss VGG: 0.1324 | loss Mask: 0.2916 | mask F1: 0.9105
(1911/6667) Data: 0.00s | Batch: 0.855s | Total: 0:27:39 | ETA: 0:00:00 | loss L1: 0.0967 | loss Refine: 0.0667 | loss VGG: 0.1307 | loss Mask: 0.2874 | mask F1: 0.9119
(2011/6667) Data: 0.00s | Batch: 0.855s | Total: 0:29:06 | ETA: 0:00:00 | loss L1: 0.0962 | loss Refine: 0.0664 | loss VGG: 0.1303 | loss Mask: 0.2840 | mask F1: 0.9130
(2111/6667) Data: 0.00s | Batch: 0.855s | Total: 0:30:32 | ETA: 0:00:00 | loss L1: 0.0955 | loss Refine: 0.0659 | loss VGG: 0.1296 | loss Mask: 0.2809 | mask F1: 0.9142
(2211/6667) Data: 0.00s | Batch: 0.864s | Total: 0:31:58 | ETA: 0:00:00 | loss L1: 0.0945 | loss Refine: 0.0652 | loss VGG: 0.1284 | loss Mask: 0.2777 | mask F1: 0.9153
(2311/6667) Data: 0.00s | Batch: 0.855s | Total: 0:33:24 | ETA: 0:00:00 | loss L1: 0.0937 | loss Refine: 0.0646 | loss VGG: 0.1274 | loss Mask: 0.2748 | mask F1: 0.9163
(2411/6667) Data: 0.00s | Batch: 0.855s | Total: 0:34:50 | ETA: 0:00:00 | loss L1: 0.0931 | loss Refine: 0.0642 | loss VGG: 0.1268 | loss Mask: 0.2721 | mask F1: 0.9172
(2511/6667) Data: 0.00s | Batch: 0.855s | Total: 0:36:16 | ETA: 0:00:00 | loss L1: 0.0923 | loss Refine: 0.0637 | loss VGG: 0.1261 | loss Mask: 0.2696 | mask F1: 0.9181
(2611/6667) Data: 0.00s | Batch: 0.856s | Total: 0:37:43 | ETA: 0:00:00 | loss L1: 0.0918 | loss Refine: 0.0634 | loss VGG: 0.1255 | loss Mask: 0.2673 | mask F1: 0.9188
(2711/6667) Data: 0.00s | Batch: 0.855s | Total: 0:39:09 | ETA: 0:00:00 | loss L1: 0.0910 | loss Refine: 0.0628 | loss VGG: 0.1246 | loss Mask: 0.2650 | mask F1: 0.9196
(2811/6667) Data: 0.00s | Batch: 0.855s | Total: 0:40:36 | ETA: 0:00:00 | loss L1: 0.0903 | loss Refine: 0.0624 | loss VGG: 0.1239 | loss Mask: 0.2628 | mask F1: 0.9204
(2911/6667) Data: 0.00s | Batch: 0.855s | Total: 0:42:02 | ETA: 0:00:00 | loss L1: 0.0898 | loss Refine: 0.0620 | loss VGG: 0.1234 | loss Mask: 0.2609 | mask F1: 0.9211
(3011/6667) Data: 0.00s | Batch: 0.862s | Total: 0:43:28 | ETA: 0:00:00 | loss L1: 0.0894 | loss Refine: 0.0617 | loss VGG: 0.1229 | loss Mask: 0.2591 | mask F1: 0.9217
(3111/6667) Data: 0.00s | Batch: 0.856s | Total: 0:44:54 | ETA: 0:00:00 | loss L1: 0.0887 | loss Refine: 0.0613 | loss VGG: 0.1221 | loss Mask: 0.2572 | mask F1: 0.9224
(3211/6667) Data: 0.00s | Batch: 0.855s | Total: 0:46:20 | ETA: 0:00:00 | loss L1: 0.0879 | loss Refine: 0.0607 | loss VGG: 0.1212 | loss Mask: 0.2553 | mask F1: 0.9231
(3311/6667) Data: 0.00s | Batch: 0.855s | Total: 0:47:47 | ETA: 0:00:00 | loss L1: 0.0874 | loss Refine: 0.0603 | loss VGG: 0.1206 | loss Mask: 0.2537 | mask F1: 0.9236
(3411/6667) Data: 0.00s | Batch: 0.855s | Total: 0:49:13 | ETA: 0:00:00 | loss L1: 0.0867 | loss Refine: 0.0599 | loss VGG: 0.1198 | loss Mask: 0.2520 | mask F1: 0.9242
(3511/6667) Data: 0.00s | Batch: 0.854s | Total: 0:50:39 | ETA: 0:00:00 | loss L1: 0.0861 | loss Refine: 0.0595 | loss VGG: 0.1191 | loss Mask: 0.2504 | mask F1: 0.9247
(3611/6667) Data: 0.00s | Batch: 0.858s | Total: 0:52:05 | ETA: 0:00:00 | loss L1: 0.0857 | loss Refine: 0.0591 | loss VGG: 0.1187 | loss Mask: 0.2490 | mask F1: 0.9252
(3711/6667) Data: 0.00s | Batch: 0.863s | Total: 0:53:32 | ETA: 0:00:00 | loss L1: 0.0850 | loss Refine: 0.0587 | loss VGG: 0.1179 | loss Mask: 0.2475 | mask F1: 0.9258
(3811/6667) Data: 0.00s | Batch: 0.854s | Total: 0:54:58 | ETA: 0:00:00 | loss L1: 0.0845 | loss Refine: 0.0583 | loss VGG: 0.1175 | loss Mask: 0.2463 | mask F1: 0.9262
(3911/6667) Data: 0.00s | Batch: 0.856s | Total: 0:56:24 | ETA: 0:00:00 | loss L1: 0.0840 | loss Refine: 0.0579 | loss VGG: 0.1170 | loss Mask: 0.2449 | mask F1: 0.9267
(4011/6667) Data: 0.00s | Batch: 0.863s | Total: 0:57:50 | ETA: 0:00:00 | loss L1: 0.0835 | loss Refine: 0.0576 | loss VGG: 0.1165 | loss Mask: 0.2436 | mask F1: 0.9272
(4111/6667) Data: 0.00s | Batch: 0.855s | Total: 0:59:16 | ETA: 0:00:00 | loss L1: 0.0831 | loss Refine: 0.0573 | loss VGG: 0.1160 | loss Mask: 0.2425 | mask F1: 0.9276
(4211/6667) Data: 0.00s | Batch: 0.862s | Total: 1:00:42 | ETA: 0:00:00 | loss L1: 0.0828 | loss Refine: 0.0571 | loss VGG: 0.1156 | loss Mask: 0.2413 | mask F1: 0.9280
(4311/6667) Data: 0.00s | Batch: 0.855s | Total: 1:02:09 | ETA: 0:00:00 | loss L1: 0.0825 | loss Refine: 0.0569 | loss VGG: 0.1152 | loss Mask: 0.2402 | mask F1: 0.9284
(4411/6667) Data: 0.00s | Batch: 0.854s | Total: 1:03:35 | ETA: 0:00:00 | loss L1: 0.0822 | loss Refine: 0.0567 | loss VGG: 0.1149 | loss Mask: 0.2391 | mask F1: 0.9288
(4511/6667) Data: 0.00s | Batch: 0.854s | Total: 1:05:01 | ETA: 0:00:00 | loss L1: 0.0818 | loss Refine: 0.0564 | loss VGG: 0.1145 | loss Mask: 0.2380 | mask F1: 0.9292
(4611/6667) Data: 0.00s | Batch: 0.854s | Total: 1:06:27 | ETA: 0:00:00 | loss L1: 0.0815 | loss Refine: 0.0562 | loss VGG: 0.1144 | loss Mask: 0.2372 | mask F1: 0.9295
(4711/6667) Data: 0.00s | Batch: 0.854s | Total: 1:07:53 | ETA: 0:00:00 | loss L1: 0.0812 | loss Refine: 0.0560 | loss VGG: 0.1139 | loss Mask: 0.2361 | mask F1: 0.9299
(4811/6667) Data: 0.00s | Batch: 0.854s | Total: 1:09:19 | ETA: 0:00:00 | loss L1: 0.0809 | loss Refine: 0.0558 | loss VGG: 0.1136 | loss Mask: 0.2352 | mask F1: 0.9302
(4911/6667) Data: 0.00s | Batch: 0.855s | Total: 1:10:46 | ETA: 0:00:00 | loss L1: 0.0806 | loss Refine: 0.0556 | loss VGG: 0.1132 | loss Mask: 0.2342 | mask F1: 0.9306
(5011/6667) Data: 0.00s | Batch: 0.855s | Total: 1:12:12 | ETA: 0:00:00 | loss L1: 0.0804 | loss Refine: 0.0555 | loss VGG: 0.1130 | loss Mask: 0.2334 | mask F1: 0.9309
(5111/6667) Data: 0.00s | Batch: 0.855s | Total: 1:13:38 | ETA: 0:00:00 | loss L1: 0.0802 | loss Refine: 0.0554 | loss VGG: 0.1127 | loss Mask: 0.2325 | mask F1: 0.9312
(5211/6667) Data: 0.00s | Batch: 0.854s | Total: 1:15:05 | ETA: 0:00:00 | loss L1: 0.0799 | loss Refine: 0.0551 | loss VGG: 0.1123 | loss Mask: 0.2315 | mask F1: 0.9315
(5311/6667) Data: 0.00s | Batch: 0.855s | Total: 1:16:31 | ETA: 0:00:00 | loss L1: 0.0797 | loss Refine: 0.0551 | loss VGG: 0.1122 | loss Mask: 0.2308 | mask F1: 0.9318
(5411/6667) Data: 0.00s | Batch: 0.863s | Total: 1:17:57 | ETA: 0:00:00 | loss L1: 0.0796 | loss Refine: 0.0550 | loss VGG: 0.1121 | loss Mask: 0.2300 | mask F1: 0.9321
(5511/6667) Data: 0.00s | Batch: 0.863s | Total: 1:19:23 | ETA: 0:00:00 | loss L1: 0.0794 | loss Refine: 0.0548 | loss VGG: 0.1120 | loss Mask: 0.2293 | mask F1: 0.9323
(5611/6667) Data: 0.00s | Batch: 0.855s | Total: 1:20:50 | ETA: 0:00:00 | loss L1: 0.0791 | loss Refine: 0.0546 | loss VGG: 0.1116 | loss Mask: 0.2285 | mask F1: 0.9326
(5711/6667) Data: 0.00s | Batch: 0.854s | Total: 1:22:16 | ETA: 0:00:00 | loss L1: 0.0788 | loss Refine: 0.0544 | loss VGG: 0.1113 | loss Mask: 0.2278 | mask F1: 0.9329
(5811/6667) Data: 0.00s | Batch: 0.854s | Total: 1:23:42 | ETA: 0:00:00 | loss L1: 0.0785 | loss Refine: 0.0542 | loss VGG: 0.1110 | loss Mask: 0.2271 | mask F1: 0.9332
(5911/6667) Data: 0.00s | Batch: 0.854s | Total: 1:25:08 | ETA: 0:00:00 | loss L1: 0.0783 | loss Refine: 0.0541 | loss VGG: 0.1108 | loss Mask: 0.2264 | mask F1: 0.9334
(6011/6667) Data: 0.00s | Batch: 0.855s | Total: 1:26:34 | ETA: 0:00:00 | loss L1: 0.0781 | loss Refine: 0.0539 | loss VGG: 0.1105 | loss Mask: 0.2257 | mask F1: 0.9337
(6111/6667) Data: 0.00s | Batch: 0.862s | Total: 1:28:00 | ETA: 0:00:00 | loss L1: 0.0779 | loss Refine: 0.0538 | loss VGG: 0.1103 | loss Mask: 0.2250 | mask F1: 0.9339
(6211/6667) Data: 0.00s | Batch: 0.863s | Total: 1:29:26 | ETA: 0:00:00 | loss L1: 0.0777 | loss Refine: 0.0537 | loss VGG: 0.1101 | loss Mask: 0.2244 | mask F1: 0.9342
(6311/6667) Data: 0.00s | Batch: 0.855s | Total: 1:30:52 | ETA: 0:00:00 | loss L1: 0.0774 | loss Refine: 0.0535 | loss VGG: 0.1098 | loss Mask: 0.2237 | mask F1: 0.9344
(6411/6667) Data: 0.00s | Batch: 0.854s | Total: 1:32:18 | ETA: 0:00:00 | loss L1: 0.0771 | loss Refine: 0.0533 | loss VGG: 0.1095 | loss Mask: 0.2230 | mask F1: 0.9347
(6511/6667) Data: 0.00s | Batch: 0.854s | Total: 1:33:45 | ETA: 0:00:00 | loss L1: 0.0769 | loss Refine: 0.0531 | loss VGG: 0.1094 | loss Mask: 0.2224 | mask F1: 0.9349
(6611/6667) Data: 0.00s | Batch: 0.854s | Total: 1:35:11 | ETA: 0:00:00 | loss L1: 0.0766 | loss Refine: 0.0529 | loss VGG: 0.1090 | loss Mask: 0.2217 | mask F1: 0.9351
(1/500) Data: 0.00s | Batch: 1.091s | Total: 0:00:01 | ETA: 0:00:00 | CPSNR: 49.4500 | CRMSEw: 0.8792 | PSNR: 49.4500 | fPSNR: 49.2833 | RMSE: 0.8625 | RMSEw: 0.8792 | SSIM: 0.9936 | IoU: 0.9345 | F1: 0.9661
(101/500) Data: 0.00s | Batch: 0.087s | Total: 0:00:06 | ETA: 0:00:00 | CPSNR: 40.9157 | CRMSEw: 9.6712 | PSNR: 41.8865 | fPSNR: 37.8569 | RMSE: 3.4299 | RMSEw: 7.5855 | SSIM: 0.9787 | IoU: 0.9084 | F1: 0.9517
(201/500) Data: 0.00s | Batch: 0.050s | Total: 0:00:11 | ETA: 0:00:00 | CPSNR: 40.6256 | CRMSEw: 9.9281 | PSNR: 41.6093 | fPSNR: 37.4722 | RMSE: 3.4847 | RMSEw: 7.7977 | SSIM: 0.9787 | IoU: 0.9075 | F1: 0.9512
(301/500) Data: 0.00s | Batch: 0.050s | Total: 0:00:17 | ETA: 0:00:00 | CPSNR: 40.5516 | CRMSEw: 9.9545 | PSNR: 41.4404 | fPSNR: 37.2893 | RMSE: 3.5699 | RMSEw: 7.9938 | SSIM: 0.9793 | IoU: 0.9073 | F1: 0.9511
(401/500) Data: 0.00s | Batch: 0.053s | Total: 0:00:22 | ETA: 0:00:00 | CPSNR: 40.5658 | CRMSEw: 9.9159 | PSNR: 41.4692 | fPSNR: 37.3034 | RMSE: 3.5475 | RMSEw: 7.9320 | SSIM: 0.9798 | IoU: 0.9075 | F1: 0.9513
Total:
(500/500) Data: 0.00s | Batch: 0.050s | Total: 0:00:27 | ETA: 0:00:00 | CPSNR: 40.5847 | CRMSEw: 9.9693 | PSNR: 41.4677 | fPSNR: 37.3216 | RMSE: 3.5773 | RMSEw: 7.9482 | SSIM: 0.9803 | IoU: 0.9075 | F1: 0.9513

Iter:70,losses:0,PSNR:41.4677,SSIM:0.9803
Saving Best Metric with PSNR:41.467715517860725

Epoch: 72 | LR: 0.00010000
(44/6667) Data: 0.00s | Batch: 0.854s | Total: 0:00:38 | ETA: 0:00:00 | loss L1: 0.0565 | loss Refine: 0.0393 | loss VGG: 0.0879 | loss Mask: 0.1794 | mask F1: 0.9508
(144/6667) Data: 0.00s | Batch: 0.855s | Total: 0:02:04 | ETA: 0:00:00 | loss L1: 0.0604 | loss Refine: 0.0425 | loss VGG: 0.0947 | loss Mask: 0.1828 | mask F1: 0.9499
(244/6667) Data: 0.00s | Batch: 0.863s | Total: 0:03:31 | ETA: 0:00:00 | loss L1: 0.0610 | loss Refine: 0.0428 | loss VGG: 0.0961 | loss Mask: 0.1814 | mask F1: 0.9505
(344/6667) Data: 0.00s | Batch: 0.855s | Total: 0:04:57 | ETA: 0:00:00 | loss L1: 0.0614 | loss Refine: 0.0427 | loss VGG: 0.0958 | loss Mask: 0.1800 | mask F1: 0.9510
(444/6667) Data: 0.00s | Batch: 0.854s | Total: 0:06:23 | ETA: 0:00:00 | loss L1: 0.0606 | loss Refine: 0.0420 | loss VGG: 0.0936 | loss Mask: 0.1795 | mask F1: 0.9511
(544/6667) Data: 0.00s | Batch: 0.863s | Total: 0:07:49 | ETA: 0:00:00 | loss L1: 0.0607 | loss Refine: 0.0421 | loss VGG: 0.0928 | loss Mask: 0.1793 | mask F1: 0.9510
(644/6667) Data: 0.00s | Batch: 0.854s | Total: 0:09:16 | ETA: 0:00:00 | loss L1: 0.0611 | loss Refine: 0.0424 | loss VGG: 0.0931 | loss Mask: 0.1793 | mask F1: 0.9511
(744/6667) Data: 0.00s | Batch: 0.854s | Total: 0:10:42 | ETA: 0:00:00 | loss L1: 0.0605 | loss Refine: 0.0418 | loss VGG: 0.0919 | loss Mask: 0.1790 | mask F1: 0.9511
(844/6667) Data: 0.00s | Batch: 0.854s | Total: 0:12:08 | ETA: 0:00:00 | loss L1: 0.0606 | loss Refine: 0.0418 | loss VGG: 0.0922 | loss Mask: 0.1791 | mask F1: 0.9512
(944/6667) Data: 0.00s | Batch: 0.858s | Total: 0:13:35 | ETA: 0:00:00 | loss L1: 0.0594 | loss Refine: 0.0409 | loss VGG: 0.0906 | loss Mask: 0.1783 | mask F1: 0.9515
(1044/6667) Data: 0.00s | Batch: 0.854s | Total: 0:15:01 | ETA: 0:00:00 | loss L1: 0.0596 | loss Refine: 0.0411 | loss VGG: 0.0908 | loss Mask: 0.1783 | mask F1: 0.9514
(1144/6667) Data: 0.00s | Batch: 0.858s | Total: 0:16:27 | ETA: 0:00:00 | loss L1: 0.0593 | loss Refine: 0.0408 | loss VGG: 0.0902 | loss Mask: 0.1779 | mask F1: 0.9515
(1244/6667) Data: 0.00s | Batch: 0.855s | Total: 0:17:53 | ETA: 0:00:00 | loss L1: 0.0595 | loss Refine: 0.0411 | loss VGG: 0.0905 | loss Mask: 0.1777 | mask F1: 0.9516
(1344/6667) Data: 0.00s | Batch: 0.863s | Total: 0:19:19 | ETA: 0:00:00 | loss L1: 0.0596 | loss Refine: 0.0412 | loss VGG: 0.0908 | loss Mask: 0.1777 | mask F1: 0.9517
(1444/6667) Data: 0.00s | Batch: 0.855s | Total: 0:20:45 | ETA: 0:00:00 | loss L1: 0.0601 | loss Refine: 0.0415 | loss VGG: 0.0914 | loss Mask: 0.1775 | mask F1: 0.9517
(1544/6667) Data: 0.00s | Batch: 0.855s | Total: 0:22:12 | ETA: 0:00:00 | loss L1: 0.0601 | loss Refine: 0.0415 | loss VGG: 0.0911 | loss Mask: 0.1773 | mask F1: 0.9518
(1644/6667) Data: 0.00s | Batch: 0.861s | Total: 0:23:38 | ETA: 0:00:00 | loss L1: 0.0600 | loss Refine: 0.0414 | loss VGG: 0.0909 | loss Mask: 0.1770 | mask F1: 0.9519
(1744/6667) Data: 0.00s | Batch: 0.855s | Total: 0:25:04 | ETA: 0:00:00 | loss L1: 0.0598 | loss Refine: 0.0412 | loss VGG: 0.0906 | loss Mask: 0.1769 | mask F1: 0.9520
(1844/6667) Data: 0.00s | Batch: 0.855s | Total: 0:26:30 | ETA: 0:00:00 | loss L1: 0.0598 | loss Refine: 0.0412 | loss VGG: 0.0903 | loss Mask: 0.1767 | mask F1: 0.9521
(1944/6667) Data: 0.00s | Batch: 0.855s | Total: 0:27:56 | ETA: 0:00:00 | loss L1: 0.0595 | loss Refine: 0.0411 | loss VGG: 0.0899 | loss Mask: 0.1766 | mask F1: 0.9522
(2044/6667) Data: 0.00s | Batch: 0.856s | Total: 0:29:23 | ETA: 0:00:00 | loss L1: 0.0596 | loss Refine: 0.0411 | loss VGG: 0.0897 | loss Mask: 0.1763 | mask F1: 0.9522
(2144/6667) Data: 0.00s | Batch: 0.855s | Total: 0:30:49 | ETA: 0:00:00 | loss L1: 0.0595 | loss Refine: 0.0410 | loss VGG: 0.0898 | loss Mask: 0.1761 | mask F1: 0.9523
(2244/6667) Data: 0.00s | Batch: 0.855s | Total: 0:32:15 | ETA: 0:00:00 | loss L1: 0.0597 | loss Refine: 0.0411 | loss VGG: 0.0901 | loss Mask: 0.1761 | mask F1: 0.9524
(2344/6667) Data: 0.00s | Batch: 0.854s | Total: 0:33:41 | ETA: 0:00:00 | loss L1: 0.0596 | loss Refine: 0.0410 | loss VGG: 0.0899 | loss Mask: 0.1759 | mask F1: 0.9525
(2444/6667) Data: 0.00s | Batch: 0.855s | Total: 0:35:08 | ETA: 0:00:00 | loss L1: 0.0599 | loss Refine: 0.0412 | loss VGG: 0.0901 | loss Mask: 0.1758 | mask F1: 0.9525
(2544/6667) Data: 0.00s | Batch: 0.855s | Total: 0:36:34 | ETA: 0:00:00 | loss L1: 0.0599 | loss Refine: 0.0412 | loss VGG: 0.0902 | loss Mask: 0.1756 | mask F1: 0.9526
(2644/6667) Data: 0.00s | Batch: 0.863s | Total: 0:38:00 | ETA: 0:00:00 | loss L1: 0.0599 | loss Refine: 0.0412 | loss VGG: 0.0901 | loss Mask: 0.1754 | mask F1: 0.9527
(2744/6667) Data: 0.00s | Batch: 0.855s | Total: 0:39:26 | ETA: 0:00:00 | loss L1: 0.0600 | loss Refine: 0.0413 | loss VGG: 0.0904 | loss Mask: 0.1753 | mask F1: 0.9527
(2844/6667) Data: 0.00s | Batch: 0.863s | Total: 0:40:53 | ETA: 0:00:00 | loss L1: 0.0600 | loss Refine: 0.0412 | loss VGG: 0.0903 | loss Mask: 0.1751 | mask F1: 0.9528
(2944/6667) Data: 0.00s | Batch: 0.855s | Total: 0:42:19 | ETA: 0:00:00 | loss L1: 0.0598 | loss Refine: 0.0411 | loss VGG: 0.0901 | loss Mask: 0.1750 | mask F1: 0.9529
(3044/6667) Data: 0.00s | Batch: 0.854s | Total: 0:43:45 | ETA: 0:00:00 | loss L1: 0.0599 | loss Refine: 0.0412 | loss VGG: 0.0903 | loss Mask: 0.1750 | mask F1: 0.9529
(3144/6667) Data: 0.00s | Batch: 0.855s | Total: 0:45:11 | ETA: 0:00:00 | loss L1: 0.0597 | loss Refine: 0.0411 | loss VGG: 0.0901 | loss Mask: 0.1748 | mask F1: 0.9530
(3244/6667) Data: 0.00s | Batch: 0.856s | Total: 0:46:37 | ETA: 0:00:00 | loss L1: 0.0596 | loss Refine: 0.0410 | loss VGG: 0.0900 | loss Mask: 0.1747 | mask F1: 0.9530
(3344/6667) Data: 0.00s | Batch: 0.860s | Total: 0:48:04 | ETA: 0:00:00 | loss L1: 0.0597 | loss Refine: 0.0411 | loss VGG: 0.0900 | loss Mask: 0.1745 | mask F1: 0.9531
(3444/6667) Data: 0.00s | Batch: 0.857s | Total: 0:49:30 | ETA: 0:00:00 | loss L1: 0.0594 | loss Refine: 0.0409 | loss VGG: 0.0897 | loss Mask: 0.1744 | mask F1: 0.9531
(3544/6667) Data: 0.00s | Batch: 0.855s | Total: 0:50:56 | ETA: 0:00:00 | loss L1: 0.0594 | loss Refine: 0.0409 | loss VGG: 0.0895 | loss Mask: 0.1742 | mask F1: 0.9532
(3644/6667) Data: 0.00s | Batch: 0.864s | Total: 0:52:22 | ETA: 0:00:00 | loss L1: 0.0596 | loss Refine: 0.0410 | loss VGG: 0.0899 | loss Mask: 0.1742 | mask F1: 0.9532
(3744/6667) Data: 0.00s | Batch: 0.855s | Total: 0:53:49 | ETA: 0:00:00 | loss L1: 0.0594 | loss Refine: 0.0409 | loss VGG: 0.0895 | loss Mask: 0.1740 | mask F1: 0.9533
(3844/6667) Data: 0.00s | Batch: 0.855s | Total: 0:55:15 | ETA: 0:00:00 | loss L1: 0.0593 | loss Refine: 0.0408 | loss VGG: 0.0894 | loss Mask: 0.1738 | mask F1: 0.9533
(3944/6667) Data: 0.00s | Batch: 0.854s | Total: 0:56:41 | ETA: 0:00:00 | loss L1: 0.0592 | loss Refine: 0.0408 | loss VGG: 0.0894 | loss Mask: 0.1737 | mask F1: 0.9534
(4044/6667) Data: 0.00s | Batch: 0.855s | Total: 0:58:07 | ETA: 0:00:00 | loss L1: 0.0591 | loss Refine: 0.0407 | loss VGG: 0.0893 | loss Mask: 0.1735 | mask F1: 0.9535
(4144/6667) Data: 0.00s | Batch: 0.855s | Total: 0:59:33 | ETA: 0:00:00 | loss L1: 0.0590 | loss Refine: 0.0406 | loss VGG: 0.0890 | loss Mask: 0.1734 | mask F1: 0.9535
(4244/6667) Data: 0.00s | Batch: 0.874s | Total: 1:00:59 | ETA: 0:00:00 | loss L1: 0.0589 | loss Refine: 0.0405 | loss VGG: 0.0889 | loss Mask: 0.1732 | mask F1: 0.9536
(4344/6667) Data: 0.00s | Batch: 0.855s | Total: 1:02:25 | ETA: 0:00:00 | loss L1: 0.0588 | loss Refine: 0.0404 | loss VGG: 0.0888 | loss Mask: 0.1731 | mask F1: 0.9537
(4444/6667) Data: 0.00s | Batch: 0.854s | Total: 1:03:51 | ETA: 0:00:00 | loss L1: 0.0587 | loss Refine: 0.0404 | loss VGG: 0.0888 | loss Mask: 0.1729 | mask F1: 0.9537
(4544/6667) Data: 0.00s | Batch: 0.856s | Total: 1:05:18 | ETA: 0:00:00 | loss L1: 0.0586 | loss Refine: 0.0403 | loss VGG: 0.0886 | loss Mask: 0.1728 | mask F1: 0.9538
(4644/6667) Data: 0.00s | Batch: 0.857s | Total: 1:06:44 | ETA: 0:00:00 | loss L1: 0.0586 | loss Refine: 0.0403 | loss VGG: 0.0886 | loss Mask: 0.1727 | mask F1: 0.9539
(4744/6667) Data: 0.00s | Batch: 0.854s | Total: 1:08:10 | ETA: 0:00:00 | loss L1: 0.0583 | loss Refine: 0.0401 | loss VGG: 0.0883 | loss Mask: 0.1725 | mask F1: 0.9539
(4844/6667) Data: 0.00s | Batch: 0.856s | Total: 1:09:36 | ETA: 0:00:00 | loss L1: 0.0583 | loss Refine: 0.0401 | loss VGG: 0.0883 | loss Mask: 0.1723 | mask F1: 0.9540
(4944/6667) Data: 0.00s | Batch: 0.855s | Total: 1:11:02 | ETA: 0:00:00 | loss L1: 0.0583 | loss Refine: 0.0401 | loss VGG: 0.0884 | loss Mask: 0.1723 | mask F1: 0.9540
(5044/6667) Data: 0.00s | Batch: 0.855s | Total: 1:12:28 | ETA: 0:00:00 | loss L1: 0.0582 | loss Refine: 0.0401 | loss VGG: 0.0883 | loss Mask: 0.1722 | mask F1: 0.9541
(5144/6667) Data: 0.00s | Batch: 0.854s | Total: 1:13:54 | ETA: 0:00:00 | loss L1: 0.0581 | loss Refine: 0.0400 | loss VGG: 0.0882 | loss Mask: 0.1720 | mask F1: 0.9541
(5244/6667) Data: 0.00s | Batch: 0.855s | Total: 1:15:21 | ETA: 0:00:00 | loss L1: 0.0580 | loss Refine: 0.0399 | loss VGG: 0.0880 | loss Mask: 0.1718 | mask F1: 0.9542
(5344/6667) Data: 0.00s | Batch: 0.865s | Total: 1:16:47 | ETA: 0:00:00 | loss L1: 0.0579 | loss Refine: 0.0399 | loss VGG: 0.0879 | loss Mask: 0.1717 | mask F1: 0.9543
(5444/6667) Data: 0.00s | Batch: 0.872s | Total: 1:18:13 | ETA: 0:00:00 | loss L1: 0.0580 | loss Refine: 0.0399 | loss VGG: 0.0879 | loss Mask: 0.1716 | mask F1: 0.9543
(5544/6667) Data: 0.00s | Batch: 0.855s | Total: 1:19:39 | ETA: 0:00:00 | loss L1: 0.0582 | loss Refine: 0.0400 | loss VGG: 0.0881 | loss Mask: 0.1715 | mask F1: 0.9543
(5644/6667) Data: 0.00s | Batch: 0.855s | Total: 1:21:05 | ETA: 0:00:00 | loss L1: 0.0582 | loss Refine: 0.0400 | loss VGG: 0.0882 | loss Mask: 0.1714 | mask F1: 0.9544
(5744/6667) Data: 0.00s | Batch: 0.855s | Total: 1:22:32 | ETA: 0:00:00 | loss L1: 0.0581 | loss Refine: 0.0400 | loss VGG: 0.0881 | loss Mask: 0.1713 | mask F1: 0.9544
(5844/6667) Data: 0.00s | Batch: 0.855s | Total: 1:23:58 | ETA: 0:00:00 | loss L1: 0.0582 | loss Refine: 0.0400 | loss VGG: 0.0882 | loss Mask: 0.1711 | mask F1: 0.9545
(5944/6667) Data: 0.00s | Batch: 0.866s | Total: 1:25:25 | ETA: 0:00:00 | loss L1: 0.0582 | loss Refine: 0.0400 | loss VGG: 0.0883 | loss Mask: 0.1711 | mask F1: 0.9545
(6044/6667) Data: 0.00s | Batch: 0.855s | Total: 1:26:51 | ETA: 0:00:00 | loss L1: 0.0581 | loss Refine: 0.0400 | loss VGG: 0.0882 | loss Mask: 0.1710 | mask F1: 0.9546
(6144/6667) Data: 0.00s | Batch: 0.855s | Total: 1:28:17 | ETA: 0:00:00 | loss L1: 0.0581 | loss Refine: 0.0399 | loss VGG: 0.0882 | loss Mask: 0.1708 | mask F1: 0.9546
(6244/6667) Data: 0.00s | Batch: 0.876s | Total: 1:29:43 | ETA: 0:00:00 | loss L1: 0.0581 | loss Refine: 0.0399 | loss VGG: 0.0882 | loss Mask: 0.1707 | mask F1: 0.9547
(6344/6667) Data: 0.00s | Batch: 0.856s | Total: 1:31:09 | ETA: 0:00:00 | loss L1: 0.0580 | loss Refine: 0.0399 | loss VGG: 0.0881 | loss Mask: 0.1706 | mask F1: 0.9547
(6444/6667) Data: 0.00s | Batch: 0.855s | Total: 1:32:35 | ETA: 0:00:00 | loss L1: 0.0580 | loss Refine: 0.0399 | loss VGG: 0.0881 | loss Mask: 0.1705 | mask F1: 0.9548
(6544/6667) Data: 0.00s | Batch: 0.855s | Total: 1:34:02 | ETA: 0:00:00 | loss L1: 0.0579 | loss Refine: 0.0398 | loss VGG: 0.0880 | loss Mask: 0.1704 | mask F1: 0.9548
(6644/6667) Data: 0.00s | Batch: 0.855s | Total: 1:35:28 | ETA: 0:00:00 | loss L1: 0.0579 | loss Refine: 0.0398 | loss VGG: 0.0879 | loss Mask: 0.1703 | mask F1: 0.9549
(1/500) Data: 0.00s | Batch: 0.278s | Total: 0:00:00 | ETA: 0:00:00 | CPSNR: 49.2353 | CRMSEw: 0.8036 | PSNR: 49.2353 | fPSNR: 50.0638 | RMSE: 0.8840 | RMSEw: 0.8036 | SSIM: 0.9931 | IoU: 0.9449 | F1: 0.9716
(101/500) Data: 0.00s | Batch: 0.058s | Total: 0:00:05 | ETA: 0:00:00 | CPSNR: 40.8388 | CRMSEw: 9.1313 | PSNR: 41.7142 | fPSNR: 38.1961 | RMSE: 3.2501 | RMSEw: 6.9901 | SSIM: 0.9769 | IoU: 0.9228 | F1: 0.9596
(201/500) Data: 0.00s | Batch: 0.050s | Total: 0:00:11 | ETA: 0:00:00 | CPSNR: 40.4948 | CRMSEw: 9.3897 | PSNR: 41.3828 | fPSNR: 37.7619 | RMSE: 3.3463 | RMSEw: 7.2591 | SSIM: 0.9770 | IoU: 0.9215 | F1: 0.9589
(301/500) Data: 0.00s | Batch: 0.055s | Total: 0:00:16 | ETA: 0:00:00 | CPSNR: 40.4324 | CRMSEw: 9.3647 | PSNR: 41.2522 | fPSNR: 37.6404 | RMSE: 3.4016 | RMSEw: 7.3573 | SSIM: 0.9777 | IoU: 0.9211 | F1: 0.9587
(401/500) Data: 0.00s | Batch: 0.051s | Total: 0:00:22 | ETA: 0:00:00 | CPSNR: 40.4032 | CRMSEw: 9.3707 | PSNR: 41.2519 | fPSNR: 37.6203 | RMSE: 3.3967 | RMSEw: 7.3338 | SSIM: 0.9782 | IoU: 0.9213 | F1: 0.9588
Total:
(500/500) Data: 0.00s | Batch: 0.050s | Total: 0:00:27 | ETA: 0:00:00 | CPSNR: 40.4026 | CRMSEw: 9.4328 | PSNR: 41.2362 | fPSNR: 37.6273 | RMSE: 3.4435 | RMSEw: 7.4006 | SSIM: 0.9788 | IoU: 0.9213 | F1: 0.9588

Iter:71,losses:0,PSNR:41.2362,SSIM:0.9788

Epoch: 73 | LR: 0.00010000
(77/6667) Data: 0.00s | Batch: 0.855s | Total: 0:01:06 | ETA: 0:00:00 | loss L1: 0.0506 | loss Refine: 0.0356 | loss VGG: 0.0826 | loss Mask: 0.1633 | mask F1: 0.9581
(177/6667) Data: 0.00s | Batch: 0.855s | Total: 0:02:33 | ETA: 0:00:00 | loss L1: 0.0526 | loss Refine: 0.0361 | loss VGG: 0.0837 | loss Mask: 0.1635 | mask F1: 0.9582
(277/6667) Data: 0.00s | Batch: 0.856s | Total: 0:03:59 | ETA: 0:00:00 | loss L1: 0.0532 | loss Refine: 0.0366 | loss VGG: 0.0836 | loss Mask: 0.1618 | mask F1: 0.9587
  "(377/6667) Data: 0.00s | Batch: 0.858s | Total: 0:05:25 | ETA: 0:00:00 | loss L1: 0.0518 | loss Refine: 0.0354 | loss VGG: 0.0826 | loss Mask: 0.1615 | mask F1: 0.9590\r\n",
  "(477/6667) Data: 0.00s | Batch: 0.855s | Total: 0:06:51 | ETA: 0:00:00 | loss L1: 0.0514 | loss Refine: 0.0353 | loss VGG: 0.0817 | loss Mask: 0.1611 | mask F1: 0.9590\r\n",
  "(577/6667) Data: 0.00s | Batch: 0.855s | Total: 0:08:17 | ETA: 0:00:00 | loss L1: 0.0517 | loss Refine: 0.0354 | loss VGG: 0.0814 | loss Mask: 0.1611 | mask F1: 0.9589\r\n",
  "(677/6667) Data: 0.00s | Batch: 0.856s | Total: 0:09:44 | ETA: 0:00:00 | loss L1: 0.0527 | loss Refine: 0.0362 | loss VGG: 0.0821 | loss Mask: 0.1610 | mask F1: 0.9589\r\n",
  "(777/6667) Data: 0.00s | Batch: 0.855s | Total: 0:11:10 | ETA: 0:00:00 | loss L1: 0.0531 | loss Refine: 0.0363 | loss VGG: 0.0826 | loss Mask: 0.1611 | mask F1: 0.9590\r\n",
  "(877/6667) Data: 0.00s | Batch: 0.856s | Total: 0:12:36 | ETA: 0:00:00 | loss L1: 0.0535 | loss Refine: 0.0366 | loss VGG: 0.0827 | loss Mask: 0.1611 | mask F1: 0.9590\r\n",
  "(977/6667) Data: 0.00s | Batch: 0.862s | Total: 0:14:03 | ETA: 0:00:00 | loss L1: 0.0540 | loss Refine: 0.0369 | loss VGG: 0.0835 | loss Mask: 0.1613 | mask F1: 0.9589\r\n",
  "(1077/6667) Data: 0.00s | Batch: 0.855s | Total: 0:15:29 | ETA: 0:00:00 | loss L1: 0.0545 | loss Refine: 0.0373 | loss VGG: 0.0841 | loss Mask: 0.1615 | mask F1: 0.9588\r\n",
  "(1177/6667) Data: 0.00s | Batch: 0.855s | Total: 0:16:55 | ETA: 0:00:00 | loss L1: 0.0547 | loss Refine: 0.0374 | loss VGG: 0.0847 | loss Mask: 0.1616 | mask F1: 0.9588\r\n",
  "(1277/6667) Data: 0.00s | Batch: 0.855s | Total: 0:18:21 | ETA: 0:00:00 | loss L1: 0.0549 | loss Refine: 0.0376 | loss VGG: 0.0850 | loss Mask: 0.1615 | mask F1: 0.9589\r\n",
  "(1377/6667) Data: 0.00s | Batch: 0.855s | Total: 0:19:47 | ETA: 0:00:00 | loss L1: 0.0546 | loss Refine: 0.0375 | loss VGG: 0.0846 | loss Mask: 0.1613 | mask F1: 0.9589\r\n",
  "(1477/6667) Data: 0.00s | Batch: 0.856s | Total: 0:21:14 | ETA: 0:00:00 | loss L1: 0.0546 | loss Refine: 0.0374 | loss VGG: 0.0842 | loss Mask: 0.1611 | mask F1: 0.9590\r\n",
  "(1577/6667) Data: 0.00s | Batch: 0.862s | Total: 0:22:40 | ETA: 0:00:00 | loss L1: 0.0546 | loss Refine: 0.0374 | loss VGG: 0.0842 | loss Mask: 0.1611 | mask F1: 0.9590\r\n",
  "(1677/6667) Data: 0.00s | Batch: 0.854s | Total: 0:24:06 | ETA: 0:00:00 | loss L1: 0.0548 | loss Refine: 0.0376 | loss VGG: 0.0846 | loss Mask: 0.1610 | mask F1: 0.9591\r\n",
  "(1777/6667) Data: 0.00s | Batch: 0.863s | Total: 0:25:32 | ETA: 0:00:00 | loss L1: 0.0549 | loss Refine: 0.0375 | loss VGG: 0.0849 | loss Mask: 0.1609 | mask F1: 0.9591\r\n",
  "(1877/6667) Data: 0.00s | Batch: 0.855s | Total: 0:26:59 | ETA: 0:00:00 | loss L1: 0.0548 | loss Refine: 0.0374 | loss VGG: 0.0846 | loss Mask: 0.1608 | mask F1: 0.9591\r\n",
  "(1977/6667) Data: 0.00s | Batch: 0.854s | Total: 0:28:25 | ETA: 0:00:00 | loss L1: 0.0547 | loss Refine: 0.0373 | loss VGG: 0.0844 | loss Mask: 0.1606 | mask F1: 0.9592\r\n",
  "(2077/6667) Data: 0.00s | Batch: 0.855s | Total: 0:29:51 | ETA: 0:00:00 | loss L1: 0.0548 | loss Refine: 0.0374 | loss VGG: 0.0846 | loss Mask: 0.1606 | mask F1: 0.9592\r\n",
  "(2177/6667) Data: 0.00s | Batch: 0.855s | Total: 0:31:17 | ETA: 0:00:00 | loss L1: 0.0546 | loss Refine: 0.0373 | loss VGG: 0.0842 | loss Mask: 0.1604 | mask F1: 0.9593\r\n",
  "(2277/6667) Data: 0.00s | Batch: 0.855s | Total: 0:32:43 | ETA: 0:00:00 | loss L1: 0.0549 | loss Refine: 0.0376 | loss VGG: 0.0847 | loss Mask: 0.1604 | mask F1: 0.9593\r\n",
  "(2377/6667) Data: 0.00s | Batch: 0.855s | Total: 0:34:10 | ETA: 0:00:00 | loss L1: 0.0548 | loss Refine: 0.0375 | loss VGG: 0.0847 | loss Mask: 0.1603 | mask F1: 0.9594\r\n",
  "(2477/6667) Data: 0.00s | Batch: 0.856s | Total: 0:35:36 | ETA: 0:00:00 | loss L1: 0.0549 | loss Refine: 0.0376 | loss VGG: 0.0849 | loss Mask: 0.1603 | mask F1: 0.9594\r\n",
  "(2577/6667) Data: 0.00s | Batch: 0.855s | Total: 0:37:02 | ETA: 0:00:00 | loss L1: 0.0547 | loss Refine: 0.0374 | loss VGG: 0.0846 | loss Mask: 0.1603 | mask F1: 0.9594\r\n",
  "(2677/6667) Data: 0.00s | Batch: 0.855s | Total: 0:38:28 | ETA: 0:00:00 | loss L1: 0.0549 | loss Refine: 0.0376 | loss VGG: 0.0848 | loss Mask: 0.1602 | mask F1: 0.9595\r\n",
  "(2777/6667) Data: 0.00s | Batch: 0.855s | Total: 0:39:54 | ETA: 0:00:00 | loss L1: 0.0551 | loss Refine: 0.0377 | loss VGG: 0.0850 | loss Mask: 0.1602 | mask F1: 0.9595\r\n",
  "(2877/6667) Data: 0.00s | Batch: 0.855s | Total: 0:41:20 | ETA: 0:00:00 | loss L1: 0.0551 | loss Refine: 0.0377 | loss VGG: 0.0851 | loss Mask: 0.1601 | mask F1: 0.9595\r\n",
  "(2977/6667) Data: 0.00s | Batch: 0.855s | Total: 0:42:47 | ETA: 0:00:00 | loss L1: 0.0551 | loss Refine: 0.0377 | loss VGG: 0.0852 | loss Mask: 0.1601 | mask F1: 0.9595\r\n",
  "(3077/6667) Data: 0.00s | Batch: 0.855s | Total: 0:44:13 | ETA: 0:00:00 | loss L1: 0.0551 | loss Refine: 0.0377 | loss VGG: 0.0851 | loss Mask: 0.1601 | mask F1: 0.9595\r\n",
  "(3177/6667) Data: 0.00s | Batch: 0.855s | Total: 0:45:39 | ETA: 0:00:00 | loss L1: 0.0550 | loss Refine: 0.0377 | loss VGG: 0.0850 | loss Mask: 0.1599 | mask F1: 0.9596\r\n",
  "(3277/6667) Data: 0.00s | Batch: 0.855s | Total: 0:47:06 | ETA: 0:00:00 | loss L1: 0.0549 | loss Refine: 0.0376 | loss VGG: 0.0848 | loss Mask: 0.1598 | mask F1: 0.9596\r\n",
  "(3377/6667) Data: 0.00s | Batch: 0.855s | Total: 0:48:32 | ETA: 0:00:00 | loss L1: 0.0549 | loss Refine: 0.0376 | loss VGG: 0.0847 | loss Mask: 0.1598 | mask F1: 0.9596\r\n",
  "(3477/6667) Data: 0.00s | Batch: 0.856s | Total: 0:49:58 | ETA: 0:00:00 | loss L1: 0.0548 | loss Refine: 0.0375 | loss VGG: 0.0845 | loss Mask: 0.1597 | mask F1: 0.9597\r\n",
  "(3577/6667) Data: 0.00s | Batch: 0.855s | Total: 0:51:24 | ETA: 0:00:00 | loss L1: 0.0548 | loss Refine: 0.0375 | loss VGG: 0.0844 | loss Mask: 0.1596 | mask F1: 0.9597\r\n",
  "(3677/6667) Data: 0.00s | Batch: 0.855s | Total: 0:52:50 | ETA: 0:00:00 | loss L1: 0.0547 | loss Refine: 0.0374 | loss VGG: 0.0844 | loss Mask: 0.1595 | mask F1: 0.9598\r\n",
  "(3777/6667) Data: 0.00s | Batch: 0.863s | Total: 0:54:16 | ETA: 0:00:00 | loss L1: 0.0545 | loss Refine: 0.0373 | loss VGG: 0.0843 | loss Mask: 0.1594 | mask F1: 0.9598\r\n",
  "(3877/6667) Data: 0.00s | Batch: 0.855s | Total: 0:55:43 | ETA: 0:00:00 | loss L1: 0.0544 | loss Refine: 0.0372 | loss VGG: 0.0841 | loss Mask: 0.1593 | mask F1: 0.9599\r\n",
  "(3977/6667) Data: 0.00s | Batch: 0.855s | Total: 0:57:09 | ETA: 0:00:00 | loss L1: 0.0546 | loss Refine: 0.0373 | loss VGG: 0.0844 | loss Mask: 0.1593 | mask F1: 0.9599\r\n",
  "(4077/6667) Data: 0.00s | Batch: 0.855s | Total: 0:58:35 | ETA: 0:00:00 | loss L1: 0.0546 | loss Refine: 0.0373 | loss VGG: 0.0845 | loss Mask: 0.1593 | mask F1: 0.9599\r\n",
  "(4177/6667) Data: 0.00s | Batch: 0.856s | Total: 1:00:02 | ETA: 0:00:00 | loss L1: 0.0546 | loss Refine: 0.0373 | loss VGG: 0.0844 | loss Mask: 0.1593 | mask F1: 0.9599\r\n",
  "(4277/6667) Data: 0.00s | Batch: 0.855s | Total: 1:01:28 | ETA: 0:00:00 | loss L1: 0.0545 | loss Refine: 0.0372 | loss VGG: 0.0844 | loss Mask: 0.1592 | mask F1: 0.9600\r\n",
  "(4377/6667) Data: 0.00s | Batch: 0.859s | Total: 1:02:54 | ETA: 0:00:00 | loss L1: 0.0544 | loss Refine: 0.0371 | loss VGG: 0.0842 | loss Mask: 0.1590 | mask F1: 0.9600\r\n",
  "(4477/6667) Data: 0.00s | Batch: 0.855s | Total: 1:04:20 | ETA: 0:00:00 | loss L1: 0.0544 | loss Refine: 0.0371 | loss VGG: 0.0843 | loss Mask: 0.1590 | mask F1: 0.9601\r\n",
  "(4577/6667) Data: 0.00s | Batch: 0.863s | Total: 1:05:46 | ETA: 0:00:00 | loss L1: 0.0543 | loss Refine: 0.0371 | loss VGG: 0.0842 | loss Mask: 0.1589 | mask F1: 0.9601\r\n",
  "(4677/6667) Data: 0.00s | Batch: 0.869s | Total: 1:07:13 | ETA: 0:00:00 | loss L1: 0.0543 | loss Refine: 0.0371 | loss VGG: 0.0841 | loss Mask: 0.1588 | mask F1: 0.9601\r\n",
  "(4777/6667) Data: 0.00s | Batch: 0.854s | Total: 1:08:39 | ETA: 0:00:00 | loss L1: 0.0543 | loss Refine: 0.0371 | loss VGG: 0.0841 | loss Mask: 0.1588 | mask F1: 0.9602\r\n",
  "(4877/6667) Data: 0.00s | Batch: 0.854s | Total: 1:10:05 | ETA: 0:00:00 | loss L1: 0.0542 | loss Refine: 0.0370 | loss VGG: 0.0840 | loss Mask: 0.1587 | mask F1: 0.9602\r\n",
  "(4977/6667) Data: 0.00s | Batch: 0.858s | Total: 1:11:31 | ETA: 0:00:00 | loss L1: 0.0541 | loss Refine: 0.0370 | loss VGG: 0.0839 | loss Mask: 0.1586 | mask F1: 0.9602\r\n",
  "(5077/6667) Data: 0.00s | Batch: 0.856s | Total: 1:12:58 | ETA: 0:00:00 | loss L1: 0.0540 | loss Refine: 0.0369 | loss VGG: 0.0837 | loss Mask: 0.1585 | mask F1: 0.9603\r\n",
  "(5177/6667) Data: 0.00s | Batch: 0.855s | Total: 1:14:24 | ETA: 0:00:00 | loss L1: 0.0538 | loss Refine: 0.0368 | loss VGG: 0.0835 | loss Mask: 0.1583 | mask F1: 0.9603\r\n",
  "(5277/6667) Data: 0.00s | Batch: 0.867s | Total: 1:15:50 | ETA: 0:00:00 | loss L1: 0.0537 | loss Refine: 0.0367 | loss VGG: 0.0834 | loss Mask: 0.1583 | mask F1: 0.9604\r\n",
  "(5377/6667) Data: 0.00s | Batch: 0.855s | Total: 1:17:16 | ETA: 0:00:00 | loss L1: 0.0536 | loss Refine: 0.0366 | loss VGG: 0.0832 | loss Mask: 0.1581 | mask F1: 0.9604\r\n",
  "(5477/6667) Data: 0.00s | Batch: 0.855s | Total: 1:18:42 | ETA: 0:00:00 | loss L1: 0.0537 | loss Refine: 0.0367 | loss VGG: 0.0833 | loss Mask: 0.1581 | mask F1: 0.9605\r\n",
  "(5577/6667) Data: 0.00s | Batch: 0.855s | Total: 1:20:09 | ETA: 0:00:00 | loss L1: 0.0536 | loss Refine: 0.0367 | loss VGG: 0.0833 | loss Mask: 0.1581 | mask F1: 0.9605\r\n",
  "(5677/6667) Data: 0.00s | Batch: 0.856s | Total: 1:21:35 | ETA: 0:00:00 | loss L1: 0.0535 | loss Refine: 0.0366 | loss VGG: 0.0832 | loss Mask: 0.1580 | mask F1: 0.9605\r\n",
  "(5777/6667) Data: 0.00s | Batch: 0.854s | Total: 1:23:01 | ETA: 0:00:00 | loss L1: 0.0535 | loss Refine: 0.0365 | loss VGG: 0.0831 | loss Mask: 0.1579 | mask F1: 0.9606\r\n",
  "(5877/6667) Data: 0.00s | Batch: 0.874s | Total: 1:24:27 | ETA: 0:00:00 | loss L1: 0.0534 | loss Refine: 0.0365 | loss VGG: 0.0831 | loss Mask: 0.1579 | mask F1: 0.9606\r\n",
  "(5977/6667) Data: 0.00s | Batch: 0.854s | Total: 1:25:54 | ETA: 0:00:00 | loss L1: 0.0534 | loss Refine: 0.0365 | loss VGG: 0.0830 | loss Mask: 0.1578 | mask F1: 0.9606\r\n",
  "(6077/6667) Data: 0.00s | Batch: 0.856s | Total: 1:27:20 | ETA: 0:00:00 | loss L1: 0.0534 | loss Refine: 0.0365 | loss VGG: 0.0829 | loss Mask: 0.1577 | mask F1: 0.9607\r\n",
  "(6177/6667) Data: 0.00s | Batch: 0.855s | Total: 1:28:46 | ETA: 0:00:00 | loss L1: 0.0533 | loss Refine: 0.0365 | loss VGG: 0.0828 | loss Mask: 0.1576 | mask F1: 0.9607\r\n",
  "(6277/6667) Data: 0.00s | Batch: 0.855s | Total: 1:30:12 | ETA: 0:00:00 | loss L1: 0.0533 | loss Refine: 0.0365 | loss VGG: 0.0828 | loss Mask: 0.1575 | mask F1: 0.9607\r\n",
  "(6377/6667) Data: 0.00s | Batch: 0.855s | Total: 1:31:38 | ETA: 0:00:00 | loss L1: 0.0533 | loss Refine: 0.0365 | loss VGG: 0.0828 | loss Mask: 0.1575 | mask F1: 0.9608\r\n",
  "(6477/6667) Data: 0.00s | Batch: 0.856s | Total: 1:33:05 | ETA: 0:00:00 | loss L1: 0.0533 | loss Refine: 0.0365 | loss VGG: 0.0828 | loss Mask: 0.1574 | mask F1: 0.9608\r\n",
  "(6577/6667) Data: 0.00s | Batch: 0.855s | Total: 1:34:31 | ETA: 0:00:00 | loss L1: 0.0533 | loss Refine: 0.0365 | loss VGG: 0.0828 | loss Mask: 0.1573 | mask F1: 0.9608\r\n",
  "(1/500) Data: 0.00s | Batch: 0.478s | Total: 0:00:00 | ETA: 0:00:00 | CPSNR: 48.8406 | CRMSEw: 0.8466 | PSNR: 48.8406 | fPSNR: 49.6115 | RMSE: 0.9251 | RMSEw: 0.8466 | SSIM: 0.9923 | IoU: 0.9509 | F1: 0.9748\r\n",
  "(101/500) Data: 0.00s | Batch: 0.055s | Total: 0:00:05 | ETA: 0:00:00 | CPSNR: 40.8838 | CRMSEw: 8.4407 | PSNR: 41.7286 | fPSNR: 38.7296 | RMSE: 3.1111 | RMSEw: 6.4443 | SSIM: 0.9767 | IoU: 0.9317 | F1: 0.9644\r\n",
  "(201/500) Data: 0.00s | Batch: 0.069s | Total: 0:00:11 | ETA: 0:00:00 | CPSNR: 40.5346 | CRMSEw: 8.7209 | PSNR: 41.3795 | fPSNR: 38.2634 | RMSE: 3.2065 | RMSEw: 6.7197 | SSIM: 0.9770 | IoU: 0.9302 | F1: 0.9636\r\n",
  "(301/500) Data: 0.00s | Batch: 0.053s | Total: 0:00:16 | ETA: 0:00:00 | CPSNR: 40.4669 | CRMSEw: 8.6830 | PSNR: 41.2441 | fPSNR: 38.1521 | RMSE: 3.2616 | RMSEw: 6.8211 | SSIM: 0.9777 | IoU: 0.9297 | F1: 0.9633\r\n",
  "(401/500) Data: 0.00s | Batch: 0.094s | Total: 0:00:22 | ETA: 0:00:00 | CPSNR: 40.4576 | CRMSEw: 8.6613 | PSNR: 41.2592 | fPSNR: 38.1461 | RMSE: 3.2497 | RMSEw: 6.7836 | SSIM: 0.9781 | IoU: 0.9299 | F1: 0.9635\r\n",
  "Total:\r\n",
  "(500/500) Data: 0.00s | Batch: 0.050s | Total: 0:00:27 | ETA: 0:00:00 | CPSNR: 40.4484 | CRMSEw: 8.7095 | PSNR: 41.2281 | fPSNR: 38.1427 | RMSE: 3.2953 | RMSEw: 6.8515 | SSIM: 0.9787 | IoU: 0.9300 | F1: 0.9635\r\n",
  "\r\n",
  "\u001b[?25hIter:72,losses:0,PSNR:41.2281,SSIM:0.9787\r\n",
  "\r\n",
  "Epoch: 74 | LR: 0.00010000\r\n",
  "(10/6667) Data: 0.00s | Batch: 0.855s | Total: 0:00:09 | ETA: 0:00:00 | loss L1: 0.0570 | loss Refine: 0.0373 | loss VGG: 0.0786 | loss Mask: 0.1509 | mask F1: 0.9631\r\n",
  "(110/6667) Data: 0.00s | Batch: 0.855s | Total: 0:01:36 | ETA: 0:00:00 | loss L1: 0.0507 | loss Refine: 0.0346 | loss VGG: 0.0770 | loss Mask: 0.1513 | mask F1: 0.9634\r\n",
  "(210/6667) Data: 0.00s | Batch: 0.855s | Total: 0:03:02 | ETA: 0:00:00 | loss L1: 0.0510 | loss Refine: 0.0349 | loss VGG: 0.0791 | loss Mask: 0.1523 | mask F1: 0.9635\r\n",
  "(310/6667) Data: 0.00s | Batch: 0.863s | Total: 0:04:28 | ETA: 0:00:00 | loss L1: 0.0516 | loss Refine: 0.0350 | loss VGG: 0.0799 | loss Mask: 0.1521 | mask F1: 0.9634\r\n",
  "(410/6667) Data: 0.00s | Batch: 0.855s | Total: 0:05:54 | ETA: 0:00:00 | loss L1: 0.0506 | loss Refine: 0.0345 | loss VGG: 0.0789 | loss Mask: 0.1518 | mask F1: 0.9635\r\n",
  "(510/6667) Data: 0.00s | Batch: 0.856s | Total: 0:07:20 | ETA: 0:00:00 | loss L1: 0.0516 | loss Refine: 0.0353 | loss VGG: 0.0803 | loss Mask: 0.1518 | mask F1: 0.9634\r\n",
  "(610/6667) Data: 0.00s | Batch: 0.856s | Total: 0:08:47 | ETA: 0:00:00 | loss L1: 0.0527 | loss Refine: 0.0359 | loss VGG: 0.0819 | loss Mask: 0.1524 | mask F1: 0.9633\r\n",
  "(710/6667) Data: 0.00s | Batch: 0.855s | Total: 0:10:13 | ETA: 0:00:00 | loss L1: 0.0524 | loss Refine: 0.0356 | loss VGG: 0.0819 | loss Mask: 0.1524 | mask F1: 0.9633\r\n",
  "(810/6667) Data: 0.00s | Batch: 0.861s | Total: 0:11:39 | ETA: 0:00:00 | loss L1: 0.0535 | loss Refine: 0.0364 | loss VGG: 0.0831 | loss Mask: 0.1527 | mask F1: 0.9632\r\n",
  "(910/6667) Data: 0.00s | Batch: 0.871s | Total: 0:13:06 | ETA: 0:00:00 | loss L1: 0.0537 | loss Refine: 0.0365 | loss VGG: 0.0833 | loss Mask: 0.1529 | mask F1: 0.9631\r\n",
  "(1010/6667) Data: 0.00s | Batch: 0.855s | Total: 0:14:32 | ETA: 0:00:00 | loss L1: 0.0539 | loss Refine: 0.0368 | loss VGG: 0.0835 | loss Mask: 0.1528 | mask F1: 0.9632\r\n",
  "(1110/6667) Data: 0.00s | Batch: 0.855s | Total: 0:15:58 | ETA: 0:00:00 | loss L1: 0.0535 | loss Refine: 0.0365 | loss VGG: 0.0831 | loss Mask: 0.1527 | mask F1: 0.9632\r\n",
  "(1210/6667) Data: 0.00s | Batch: 0.863s | Total: 0:17:24 | ETA: 0:00:00 | loss L1: 0.0536 | loss Refine: 0.0366 | loss VGG: 0.0831 | loss Mask: 0.1527 | mask F1: 0.9632\r\n",
  "(1310/6667) Data: 0.00s | Batch: 0.861s | Total: 0:18:50 | ETA: 0:00:00 | loss L1: 0.0536 | loss Refine: 0.0366 | loss VGG: 0.0830 | loss Mask: 0.1527 | mask F1: 0.9632\r\n",
  "(1410/6667) Data: 0.00s | Batch: 0.857s | Total: 0:20:16 | ETA: 0:00:00 | loss L1: 0.0533 | loss Refine: 0.0364 | loss VGG: 0.0828 | loss Mask: 0.1527 | mask F1: 0.9632\r\n",
  "(1510/6667) Data: 0.00s | Batch: 0.856s | Total: 0:21:43 | ETA: 0:00:00 | loss L1: 0.0532 | loss Refine: 0.0363 | loss VGG: 0.0826 | loss Mask: 0.1525 | mask F1: 0.9632\r\n",
  "(1610/6667) Data: 0.00s | Batch: 0.856s | Total: 0:23:09 | ETA: 0:00:00 | loss L1: 0.0530 | loss Refine: 0.0362 | loss VGG: 0.0826 | loss Mask: 0.1525 | mask F1: 0.9633\r\n",
  "(1710/6667) Data: 0.00s | Batch: 0.855s | Total: 0:24:35 | ETA: 0:00:00 | loss L1: 0.0528 | loss Refine: 0.0361 | loss VGG: 0.0823 | loss Mask: 0.1525 | mask F1: 0.9633\r\n",
  "(1810/6667) Data: 0.00s | Batch: 0.855s | Total: 0:26:01 | ETA: 0:00:00 | loss L1: 0.0528 | loss Refine: 0.0360 | loss VGG: 0.0820 | loss Mask: 0.1523 | mask F1: 0.9633\r\n",
  "(1910/6667) Data: 0.00s | Batch: 0.855s | Total: 0:27:27 | ETA: 0:00:00 | loss L1: 0.0526 | loss Refine: 0.0359 | loss VGG: 0.0818 | loss Mask: 0.1521 | mask F1: 0.9634\r\n",
  "(2010/6667) Data: 0.00s | Batch: 0.855s | Total: 0:28:53 | ETA: 0:00:00 | loss L1: 0.0528 | loss Refine: 0.0360 | loss VGG: 0.0821 | loss Mask: 0.1520 | mask F1: 0.9634\r\n",
  "(2110/6667) Data: 0.00s | Batch: 0.888s | Total: 0:30:20 | ETA: 0:00:00 | loss L1: 0.0525 | loss Refine: 0.0358 | loss VGG: 0.0818 | loss Mask: 0.1518 | mask F1: 0.9635\r\n",
  "(2210/6667) Data: 0.00s | Batch: 0.855s | Total: 0:31:46 | ETA: 0:00:00 | loss L1: 0.0522 | loss Refine: 0.0356 | loss VGG: 0.0814 | loss Mask: 0.1517 | mask F1: 0.9635\r\n",
  "(2310/6667) Data: 0.00s | Batch: 0.855s | Total: 0:33:12 | ETA: 0:00:00 | loss L1: 0.0520 | loss Refine: 0.0355 | loss VGG: 0.0812 | loss Mask: 0.1516 | mask F1: 0.9636\r\n",
  "(2410/6667) Data: 0.00s | Batch: 0.854s | Total: 0:34:38 | ETA: 0:00:00 | loss L1: 0.0520 | loss Refine: 0.0354 | loss VGG: 0.0813 | loss Mask: 0.1516 | mask F1: 0.9636\r\n",
  "(2510/6667) Data: 0.00s | Batch: 0.855s | Total: 0:36:05 | ETA: 0:00:00 | loss L1: 0.0520 | loss Refine: 0.0355 | loss VGG: 0.0813 | loss Mask: 0.1516 | mask F1: 0.9636\r\n",
  "(2610/6667) Data: 0.00s | Batch: 0.854s | Total: 0:37:31 | ETA: 0:00:00 | loss L1: 0.0518 | loss Refine: 0.0354 | loss VGG: 0.0812 | loss Mask: 0.1515 | mask F1: 0.9636\r\n",
  "(2710/6667) Data: 0.00s | Batch: 0.863s | Total: 0:38:57 | ETA: 0:00:00 | loss L1: 0.0517 | loss Refine: 0.0353 | loss VGG: 0.0809 | loss Mask: 0.1513 | mask F1: 0.9637\r\n",
  "(2810/6667) Data: 0.00s | Batch: 0.862s | Total: 0:40:23 | ETA: 0:00:00 | loss L1: 0.0516 | loss Refine: 0.0352 | loss VGG: 0.0809 | loss Mask: 0.1512 | mask F1: 0.9637\r\n",
  "(2910/6667) Data: 0.00s | Batch: 0.865s | Total: 0:41:49 | ETA: 0:00:00 | loss L1: 0.0516 | loss Refine: 0.0352 | loss VGG: 0.0807 | loss Mask: 0.1511 | mask F1: 0.9637\r\n",
  "(3010/6667) Data: 0.00s | Batch: 0.855s | Total: 0:43:15 | ETA: 0:00:00 | loss L1: 0.0514 | loss Refine: 0.0351 | loss VGG: 0.0804 | loss Mask: 0.1510 | mask F1: 0.9638\r\n",
  "(3110/6667) Data: 0.00s | Batch: 0.855s | Total: 0:44:42 | ETA: 0:00:00 | loss L1: 0.0512 | loss Refine: 0.0350 | loss VGG: 0.0801 | loss Mask: 0.1509 | mask F1: 0.9638\r\n",
  "(3210/6667) Data: 0.00s | Batch: 0.855s | Total: 0:46:08 | ETA: 0:00:00 | loss L1: 0.0510 | loss Refine: 0.0349 | loss VGG: 0.0799 | loss Mask: 0.1509 | mask F1: 0.9638\r\n",
  "(3310/6667) Data: 0.00s | Batch: 0.862s | Total: 0:47:34 | ETA: 0:00:00 | loss L1: 0.0509 | loss Refine: 0.0348 | loss VGG: 0.0798 | loss Mask: 0.1509 | mask F1: 0.9639\r\n",
  "(3410/6667) Data: 0.00s | Batch: 0.855s | Total: 0:49:00 | ETA: 0:00:00 | loss L1: 0.0510 | loss Refine: 0.0348 | loss VGG: 0.0798 | loss Mask: 0.1508 | mask F1: 0.9639\r\n",
  "(3510/6667) Data: 0.00s | Batch: 0.858s | Total: 0:50:26 | ETA: 0:00:00 | loss L1: 0.0509 | loss Refine: 0.0348 | loss VGG: 0.0798 | loss Mask: 0.1508 | mask F1: 0.9639\r\n",
  "(3610/6667) Data: 0.00s | Batch: 0.855s | Total: 0:51:53 | ETA: 0:00:00 | loss L1: 0.0507 | loss Refine: 0.0347 | loss VGG: 0.0797 | loss Mask: 0.1508 | mask F1: 0.9639\r\n",
  "(3710/6667) Data: 0.00s | Batch: 0.856s | Total: 0:53:19 | ETA: 0:00:00 | loss L1: 0.0509 | loss Refine: 0.0348 | loss VGG: 0.0799 | loss Mask: 0.1508 | mask F1: 0.9639\r\n",
  "(3810/6667) Data: 0.00s | Batch: 0.854s | Total: 0:54:45 | ETA: 0:00:00 | loss L1: 0.0509 | loss Refine: 0.0348 | loss VGG: 0.0800 | loss Mask: 0.1508 | mask F1: 0.9639\r\n",
  "(3910/6667) Data: 0.00s | Batch: 0.866s | Total: 0:56:11 | ETA: 0:00:00 | loss L1: 0.0510 | loss Refine: 0.0349 | loss VGG: 0.0801 | loss Mask: 0.1508 | mask F1: 0.9640\r\n",
  "(4010/6667) Data: 0.00s | Batch: 0.855s | Total: 0:57:38 | ETA: 0:00:00 | loss L1: 0.0509 | loss Refine: 0.0348 | loss VGG: 0.0801 | loss Mask: 0.1507 | mask F1: 0.9640\r\n",
  "(4110/6667) Data: 0.00s | Batch: 0.855s | Total: 0:59:04 | ETA: 0:00:00 | loss L1: 0.0508 | loss Refine: 0.0348 | loss VGG: 0.0800 | loss Mask: 0.1507 | mask F1: 0.9640\r\n",
  "(4210/6667) Data: 0.00s | Batch: 0.855s | Total: 1:00:30 | ETA: 0:00:00 | loss L1: 0.0509 | loss Refine: 0.0348 | loss VGG: 0.0800 | loss Mask: 0.1506 | mask F1: 0.9640\r\n",
  "(4310/6667) Data: 0.00s | Batch: 0.855s | Total: 1:01:56 | ETA: 0:00:00 | loss L1: 0.0510 | loss Refine: 0.0349 | loss VGG: 0.0801 | loss Mask: 0.1506 | mask F1: 0.9640\r\n",
  "(4410/6667) Data: 0.00s | Batch: 0.855s | Total: 1:03:22 | ETA: 0:00:00 | loss L1: 0.0509 | loss Refine: 0.0348 | loss VGG: 0.0800 | loss Mask: 0.1505 | mask F1: 0.9641\r\n",
  "(4510/6667) Data: 0.00s | Batch: 0.863s | Total: 1:04:48 | ETA: 0:00:00 | loss L1: 0.0508 | loss Refine: 0.0347 | loss VGG: 0.0798 | loss Mask: 0.1504 | mask F1: 0.9641\r\n",
  "(4610/6667) Data: 0.00s | Batch: 0.855s | Total: 1:06:14 | ETA: 0:00:00 | loss L1: 0.0507 | loss Refine: 0.0347 | loss VGG: 0.0797 | loss Mask: 0.1504 | mask F1: 0.9641\r\n",
  "(4710/6667) Data: 0.00s | Batch: 0.855s | Total: 1:07:41 | ETA: 0:00:00 | loss L1: 0.0506 | loss Refine: 0.0346 | loss VGG: 0.0797 | loss Mask: 0.1503 | mask F1: 0.9642\r\n",
  "(4810/6667) Data: 0.00s | Batch: 0.854s | Total: 1:09:07 | ETA: 0:00:00 | loss L1: 0.0506 | loss Refine: 0.0346 | loss VGG: 0.0797 | loss Mask: 0.1503 | mask F1: 0.9642\r\n",
  "(4910/6667) Data: 0.00s | Batch: 0.855s | Total: 1:10:33 | ETA: 0:00:00 | loss L1: 0.0507 | loss Refine: 0.0347 | loss VGG: 0.0798 | loss Mask: 0.1503 | mask F1: 0.9642\r\n",
  "(5010/6667) Data: 0.00s | Batch: 0.858s | Total: 1:11:59 | ETA: 0:00:00 | loss L1: 0.0507 | loss Refine: 0.0347 | loss VGG: 0.0799 | loss Mask: 0.1503 | mask F1: 0.9642\r\n",
  "(5110/6667) Data: 0.00s | Batch: 0.874s | Total: 1:13:25 | ETA: 0:00:00 | loss L1: 0.0507 | loss Refine: 0.0346 | loss VGG: 0.0799 | loss Mask: 0.1502 | mask F1: 0.9642\r\n",
  "(5210/6667) Data: 0.00s | Batch: 0.856s | Total: 1:14:51 | ETA: 0:00:00 | loss L1: 0.0506 | loss Refine: 0.0346 | loss VGG: 0.0799 | loss Mask: 0.1502 | mask F1: 0.9642\r\n",
  "(5310/6667) Data: 0.00s | Batch: 0.855s | Total: 1:16:17 | ETA: 0:00:00 | loss L1: 0.0505 | loss Refine: 0.0345 | loss VGG: 0.0797 | loss Mask: 0.1501 | mask F1: 0.9643\r\n",
  "(5410/6667) Data: 0.00s | Batch: 0.863s | Total: 1:17:43 | ETA: 0:00:00 | loss L1: 0.0505 | loss Refine: 0.0346 | loss VGG: 0.0797 | loss Mask: 0.1501 | mask F1: 0.9643\r\n",
  "(5510/6667) Data: 0.00s | Batch: 0.854s | Total: 1:19:09 | ETA: 0:00:00 | loss L1: 0.0504 | loss Refine: 0.0345 | loss VGG: 0.0796 | loss Mask: 0.1500 | mask F1: 0.9643\r\n",
  "(5610/6667) Data: 0.00s | Batch: 0.855s | Total: 1:20:35 | ETA: 0:00:00 | loss L1: 0.0504 | loss Refine: 0.0345 | loss VGG: 0.0797 | loss Mask: 0.1500 | mask F1: 0.9643\r\n",
  "(5710/6667) Data: 0.00s | Batch: 0.856s | Total: 1:22:01 | ETA: 0:00:00 | loss L1: 0.0505 | loss Refine: 0.0346 | loss VGG: 0.0797 | loss Mask: 0.1500 | mask F1: 0.9643\r\n",
  "(5810/6667) Data: 0.00s | Batch: 0.855s | Total: 1:23:28 | ETA: 0:00:00 | loss L1: 0.0505 | loss Refine: 0.0346 | loss VGG: 0.0797 | loss Mask: 0.1500 | mask F1: 0.9643\r\n",
  "(5910/6667) Data: 0.00s | Batch: 0.854s | Total: 1:24:54 | ETA: 0:00:00 | loss L1: 0.0505 | loss Refine: 0.0346 | loss VGG: 0.0797 | loss Mask: 0.1499 | mask F1: 0.9643\r\n",
  "(6010/6667) Data: 0.00s | Batch: 0.854s | Total: 1:26:20 | ETA: 0:00:00 | loss L1: 0.0503 | loss Refine: 0.0345 | loss VGG: 0.0795 | loss Mask: 0.1498 | mask F1: 0.9644\r\n",
  "(6110/6667) Data: 0.00s | Batch: 0.854s | Total: 1:27:46 | ETA: 0:00:00 | loss L1: 0.0504 | loss Refine: 0.0345 | loss VGG: 0.0795 | loss Mask: 0.1498 | mask F1: 0.9644\r\n",
  "(6210/6667) Data: 0.00s | Batch: 0.855s | Total: 1:29:12 | ETA: 0:00:00 | loss L1: 0.0504 | loss Refine: 0.0345 | loss VGG: 0.0796 | loss Mask: 0.1498 | mask F1: 0.9644\r\n",
  "(6310/6667) Data: 0.00s | Batch: 0.855s | Total: 1:30:38 | ETA: 0:00:00 | loss L1: 0.0504 | loss Refine: 0.0345 | loss VGG: 0.0796 | loss Mask: 0.1497 | mask F1: 0.9644\r\n",
  "(6410/6667) Data: 0.00s | Batch: 0.863s | Total: 1:32:05 | ETA: 0:00:00 | loss L1: 0.0504 | loss Refine: 0.0345 | loss VGG: 0.0795 | loss Mask: 0.1497 | mask F1: 0.9644\r\n",
  "(6510/6667) Data: 0.00s | Batch: 0.856s | Total: 1:33:31 | ETA: 0:00:00 | loss L1: 0.0504 | loss Refine: 0.0346 | loss VGG: 0.0795 | loss Mask: 0.1497 | mask F1: 0.9644\r\n",
  "(6610/6667) Data: 0.00s | Batch: 0.855s | Total: 1:34:57 | ETA: 0:00:00 | loss L1: 0.0505 | loss Refine: 0.0346 | loss VGG: 0.0797 | loss Mask: 0.1497 | mask F1: 0.9644\r\n",
  "(1/500) Data: 0.00s | Batch: 0.573s | Total: 0:00:00 | ETA: 0:00:00 | CPSNR: 48.5071 | CRMSEw: 0.9182 | PSNR: 48.5071 | fPSNR: 48.9063 | RMSE: 0.9614 | RMSEw: 0.9182 | SSIM: 0.9917 | IoU: 0.9526 | F1: 0.9757\r\n",
  "(101/500) Data: 0.00s | Batch: 0.051s | Total: 0:00:05 | ETA: 0:00:00 | CPSNR: 40.6985 | CRMSEw: 8.2244 | PSNR: 41.6000 | fPSNR: 38.7856 | RMSE: 3.0578 | RMSEw: 6.1317 | SSIM: 0.9759 | IoU: 0.9366 | F1: 0.9671\r\n",
  "(201/500) Data: 0.00s | Batch: 0.073s | Total: 0:00:11 | ETA: 0:00:00 | CPSNR: 40.3627 | CRMSEw: 8.5041 | PSNR: 41.2479 | fPSNR: 38.3134 | RMSE: 3.1748 | RMSEw: 6.4679 | SSIM: 0.9761 | IoU: 0.9348 | F1: 0.9661\r\n",
  "(301/500) Data: 0.00s | Batch: 0.050s | Total: 0:00:16 | ETA: 0:00:00 | CPSNR: 40.3184 | CRMSEw: 8.4379 | PSNR: 41.1256 | fPSNR: 38.2075 | RMSE: 3.2183 | RMSEw: 6.5484 | SSIM: 0.9769 | IoU: 0.9343 | F1: 0.9658\r\n",
  "(401/500) Data: 0.00s | Batch: 0.050s | Total: 0:00:21 | ETA: 0:00:00 | CPSNR: 40.3263 | CRMSEw: 8.3947 | PSNR: 41.1414 | fPSNR: 38.1993 | RMSE: 3.2067 | RMSEw: 6.5172 | SSIM: 0.9774 | IoU: 0.9345 | F1: 0.9659\r\n",
  "Total:\r\n",
  "(500/500) Data: 0.00s | Batch: 0.050s | Total: 0:00:27 | ETA: 0:00:00 | CPSNR: 40.3219 | CRMSEw: 8.4248 | PSNR: 41.1081 | fPSNR: 38.1906 | RMSE: 3.2515 | RMSEw: 6.5925 | SSIM: 0.9779 | IoU: 0.9346 | F1: 0.9660\r\n",
  "\r\n",
  "\u001b[?25hIter:73,losses:0,PSNR:41.1081,SSIM:0.9779\r\n",
  "\r\n",
  "Epoch: 75 | LR: 0.00010000\r\n",
  "(43/6667) Data: 0.00s | Batch: 0.855s | Total: 0:00:43 | ETA: 0:00:00 | loss L1: 0.0691 | loss Refine: 0.0493 | loss VGG: 0.1027 | loss Mask: 0.1526 | mask F1: 0.9633\r\n"
 ]
}

]

`

FlotingDream avatar Jul 01 '22 03:07 FlotingDream

For a task without masks (I only have pairs: the watermarked input and the clean target), how can this code be used?

The ground-truth watermark mask M in formula (1) of your paper is required.

Any suggestions for training without a ground-truth mask?

The watermark mask cannot be obtained exactly, but the watermark style is almost the same across the image data.

Thanks!

  1. You can compute the error map as abs(Input - Target), provided the input and target differ only in the watermarked area. You can then turn the error map into a binary mask by applying a threshold.
  2. To get better results, it is recommended to generate more watermarked data whose style is similar to your test watermark data.
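The error-map-to-mask step can be sketched as follows. This is a minimal NumPy sketch, not code from the repo; the default `threshold` and `dilate_px` values are illustrative and should be tuned per dataset:

```python
import numpy as np

def estimate_mask(watermarked, clean, threshold=15, dilate_px=1):
    """Estimate a binary watermark mask from a (watermarked, clean) pair.

    `threshold` (on the 0-255 scale) and `dilate_px` are illustrative
    defaults; tune them until the mask covers the watermarked area.
    """
    # Per-pixel error map, collapsed over the color channels.
    err = np.abs(watermarked.astype(np.int16) - clean.astype(np.int16)).max(axis=2)
    mask = err > threshold
    # Simple binary dilation by shifting the mask in every direction,
    # so the final mask covers slightly more than the visible watermark.
    # (np.roll wraps at the image borders, which is fine for interior watermarks.)
    out = mask.copy()
    for dy in range(-dilate_px, dilate_px + 1):
        for dx in range(-dilate_px, dilate_px + 1):
            out |= np.roll(np.roll(mask, dy, axis=0), dx, axis=1)
    return out.astype(np.uint8) * 255
```

A morphological library (e.g. `cv2.dilate` or `scipy.ndimage.binary_dilation`) can replace the shift-based dilation for larger radii.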

jimleungjing avatar Jul 01 '22 15:07 jimleungjing


Thanks for your quick reply.

1. I ran into a problem when training on Kaggle: after about three or four epochs I get "Your notebook tried to allocate more memory than is available" (with batch size 6 or 4 alike), even though the GPU time quota is not reached. Any idea? Or is there a memory-leak bug in the code?
2. When transferring the error map into a mask, how do I choose a proper threshold?
3. If the input or target is gray (or partly near white), the mask from abs(Input - Target) shows visible errors in the white parts of the input (since the watermark is nearly gray, with no color). Is this normal? How should it be handled?

Thanks! :)

FlotingDream avatar Jul 02 '22 03:07 FlotingDream


  1. You can free cached GPU memory with torch.cuda.empty_cache() at the end of each training epoch.
  2. Choose the threshold by observation: the generated mask should cover almost the whole watermarked area.
  3. In this case it is hard to obtain an exact mask, so it is better to use a dilated mask or to label the data yourself. For example, we preprocess the gray data with abs(Input - Target) in the LVW dataset, and we dilate some masks to cover more of the watermarked area.
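Point 1 can be wrapped in a small helper called at the end of every epoch. This is a sketch, not the repo's code; `free_gpu_memory` is a hypothetical name:

```python
import gc

try:
    import torch
    _HAS_TORCH = True
except ImportError:  # keep the sketch importable without PyTorch installed
    _HAS_TORCH = False

def free_gpu_memory():
    """Release Python garbage and PyTorch's cached CUDA blocks.

    Call this at the end of every training epoch; returns True only
    when a CUDA cache was actually cleared.
    """
    gc.collect()  # drop Python-side references first
    if _HAS_TORCH and torch.cuda.is_available():
        torch.cuda.empty_cache()  # hand cached blocks back to the allocator
        return True
    return False
```

Note that `empty_cache()` only returns cached blocks to the driver; if usage still grows epoch after epoch, also check for tensors kept alive across iterations (e.g. accumulating a loss tensor instead of `loss.item()`).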

jimleungjing avatar Jul 04 '22 15:07 jimleungjing


Thx, I will try it.

By the way, there are a lot of hyperparameters:

==> lambda_primary : 0.01(0.01)
==> lambda_mask : 1(1)
==> lambda_l1 : 2.0(4)
==> lambda_style : 0.25(0)
==> lambda_content : 0.25(0)
==> lambda_iou : 0.25(0)
==> k_refine : 3(3)
==> k_skip_stage : 3(3)
==> k_center : 2(1)
==> use_refine : 1(0)
==> masked : 1(0)
==> loss_type : hybrid(l2)

I trained on my dataset; after about 5 epochs the loss cannot go down further (lr=2e-4). Any suggestions for fine-tuning?

train: loss L1: 0.0932 | loss Refine: 0.0677 | loss VGG: 0.2302 | loss Mask: 1.8610 | mask F1: 0.9637

val: CPSNR: 36.7021 | CRMSEw: 11.2144 | PSNR: 38.1385 | fPSNR: 31.8476 | RMSE: 3.9769 | RMSEw: 8.9621 | SSIM: 0.9709 | IoU: 0.9363 | F1: 0.9651

Thx!

FlotingDream avatar Jul 05 '22 03:07 FlotingDream

It seems the performance is already good. To fine-tune further, you can improve the network or add more data to make your model more robust.

jimleungjing avatar Jul 09 '22 10:07 jimleungjing