
Training does not converge

Open robotzheng opened this issue 5 years ago • 5 comments

nohup python3 -u train_DGLR.py dataset/train/ -n dglr1 -d ./ -w 324 -e 200 -b 100 -l 2e-4 >exp1.txt &

CUDA: True
Total image patches: 405
Fri Apr 3 16:41:06 2020 [1] loss: 9261402.918, time elapsed: 84.3124577999115
Fri Apr 3 16:42:36 2020 [2] loss: 157221596364.800, time elapsed: 174.34020447731018
Fri Apr 3 16:44:07 2020 [3] loss: 1291956587008.000, time elapsed: 265.53936982154846
Fri Apr 3 16:45:38 2020 [4] loss: 1099865539379.200, time elapsed: 356.13393092155457
Fri Apr 3 16:47:09 2020 [5] loss: 9093684228497820.000, time elapsed: 447.34976029396057
Fri Apr 3 16:48:40 2020 [6] loss: 5346967163174.400, time elapsed: 538.3937747478485
Fri Apr 3 16:50:11 2020 [7] loss: 3660099438182.400, time elapsed: 629.5381162166595
Fri Apr 3 16:51:42 2020 [8] loss: 459320365260.800, time elapsed: 720.6094479560852
Fri Apr 3 16:53:13 2020 [9] loss: 1239552095436.800, time elapsed: 811.699675321579
Fri Apr 3 16:54:45 2020 [10] loss: 50846929510.400, time elapsed: 902.8164310455322
save @ epoch 10
Fri Apr 3 16:56:16 2020 [11] loss: 492139066263.575, time elapsed: 993.9461359977722
Fri Apr 3 16:57:47 2020 [12] loss: 459477565440.000, time elapsed: 1084.9796614646912
Fri Apr 3 16:59:18 2020 [13] loss: 291126666854.400, time elapsed: 1175.985060453415
Fri Apr 3 17:00:49 2020 [14] loss: 2559865579315.200, time elapsed: 1267.0888183116913
Fri Apr 3 17:02:20 2020 [15] loss: 21165851726336.000, time elapsed: 1358.1607525348663
Fri Apr 3 17:03:51 2020 [16] loss: 2323966969190.400, time elapsed: 1449.2021663188934
Fri Apr 3 17:05:22 2020 [17] loss: 914318078003.200, time elapsed: 1540.2490532398224
Fri Apr 3 17:06:53 2020 [18] loss: 85701711462.400, time elapsed: 1631.2495872974396
Fri Apr 3 17:08:24 2020 [19] loss: 71346559633946.500, time elapsed: 1722.2238075733185
Fri Apr 3 17:09:53 2020 [20] loss: 93988771840.000, time elapsed: 1811.4157123565674
save @ epoch 20
Fri Apr 3 17:11:24 2020 [21] loss: 1912039229030.400, time elapsed: 1902.5226488113403
Fri Apr 3 17:12:55 2020 [22] loss: 647459373465.600, time elapsed: 1993.4862999916077
Fri Apr 3 17:14:26 2020 [23] loss: 407940377395.200, time elapsed: 2084.4309010505676
Fri Apr 3 17:15:57 2020 [24] loss: 32367696511795.199, time elapsed: 2175.438264608383
Fri Apr 3 17:17:28 2020 [25] loss: 373118316544.000, time elapsed: 2266.4489636421204
Fri Apr 3 17:18:55 2020 [26] loss: 61081027942.400, time elapsed: 2353.2528257369995
Fri Apr 3 17:20:25 2020 [27] loss: 396101484544.000, time elapsed: 2443.499759912491
Fri Apr 3 17:21:56 2020 [28] loss: 725608193536.000, time elapsed: 2534.613247156143
Fri Apr 3 17:23:27 2020 [29] loss: 129731337225.200, time elapsed: 2625.7542798519135
Fri Apr 3 17:24:59 2020 [30] loss: 2526546442035.200, time elapsed: 2716.904239654541
save @ epoch 30
Fri Apr 3 17:26:30 2020 [31] loss: 848475979461.600, time elapsed: 2808.0334856510162
Fri Apr 3 17:27:59 2020 [32] loss: 149909784576.000, time elapsed: 2897.5095703601837
Fri Apr 3 17:29:30 2020 [33] loss: 279022619238.400, time elapsed: 2988.2605011463165
Fri Apr 3 17:31:01 2020 [34] loss: 267336174899.200, time elapsed: 3078.951763153076
Fri Apr 3 17:32:31 2020 [35] loss: 320743993769.600, time elapsed: 3169.4588356018066
Fri Apr 3 17:34:02 2020 [36] loss: 7321587460096.000, time elapsed: 3260.175270318985
Fri Apr 3 17:35:32 2020 [37] loss: 57061461401.600, time elapsed: 3350.769533395767
Fri Apr 3 17:37:03 2020 [38] loss: 662424383897.600, time elapsed: 3441.397003889084
Fri Apr 3 17:38:34 2020 [39] loss: 58169954311168.000, time elapsed: 3532.053467988968
Fri Apr 3 17:40:03 2020 [40] loss: 28055001508352.000, time elapsed: 3621.261314868927
save @ epoch 40
Fri Apr 3 17:41:34 2020 [41] loss: 156931445206.800, time elapsed: 3712.2143893241882
Fri Apr 3 21:26:06 2020 [188] loss: 169965260729.600, time elapsed: 17184.11761212349
Fri Apr 3 21:27:37 2020 [189] loss: 9980132449561.600, time elapsed: 17275.408327817917
Fri Apr 3 21:29:08 2020 [190] loss: 9775844112512.000, time elapsed: 17366.748306274414
. . .
save @ epoch 190
Fri Apr 3 21:30:40 2020 [191] loss: 21434649557606.398, time elapsed: 17457.88618659973
Fri Apr 3 21:32:11 2020 [192] loss: 90311157376.000, time elapsed: 17549.17462539673
Fri Apr 3 21:33:42 2020 [193] loss: 54351227494.400, time elapsed: 17640.454423666
Fri Apr 3 21:35:13 2020 [194] loss: 289435463526.400, time elapsed: 17731.729730844498
Fri Apr 3 21:36:45 2020 [195] loss: 720402100473.600, time elapsed: 17823.023690223694
Fri Apr 3 21:38:16 2020 [196] loss: 364105779827.200, time elapsed: 17913.783826828003
Fri Apr 3 21:39:45 2020 [197] loss: 27386050972.800, time elapsed: 18003.28243279457
Fri Apr 3 21:41:16 2020 [198] loss: 269100614976.000, time elapsed: 18094.70627975464
Fri Apr 3 21:42:48 2020 [199] loss: 1884024337203.200, time elapsed: 18186.141695022583
Fri Apr 3 21:44:19 2020 [200] loss: 257582522958.913, time elapsed: 18277.56715655327
save @ epoch 200
Total running time: 18277.604

python3 validate_DGLR.py dataset/test/ -m ./dglr1 -w 324 -o ./
CUDA: True, device: cuda
torch.Size([3, 324, 324]) torch.Size([3, 324, 324]) torch.Size([9, 3, 36, 36]), 9/9 Prediction time: 75.12231421470642 PSNR: -58.428629316809264 SSIM: 0.18655050665524675 Saved ./0.png
torch.Size([3, 324, 324]) torch.Size([3, 324, 324]) torch.Size([9, 3, 36, 36]), 9/9 Prediction time: 150.3275306224823 PSNR: -57.81696152976053 SSIM: 0.2988179192666644 Saved ./1.png
torch.Size([3, 324, 324]) torch.Size([3, 324, 324]) torch.Size([9, 3, 36, 36]), 9/9 Prediction time: 225.31568574905396 PSNR: -55.77439533681813 SSIM: 0.3555602375657634 Saved ./2.png
torch.Size([3, 324, 324]) torch.Size([3, 324, 324]) torch.Size([9, 3, 36, 36]), 9/9 Prediction time: 300.8893086910248 PSNR: -54.86550250498196 SSIM: 0.42377748241714147 Saved ./3.png
torch.Size([3, 324, 324]) torch.Size([3, 324, 324]) torch.Size([9, 3, 36, 36]), 9/9 Prediction time: 376.6220762729645 PSNR: -54.156079415464056 SSIM: 0.3259296222616892 Saved ./4.png
Mean PSNR: -56.208 Mean SSIM: 0.318
Total running time: 376.771

robotzheng avatar Apr 07 '20 05:04 robotzheng

Hello robotzheng,

Are you training a DeepGLR from scratch? If so, I'd recommend training single GLRs first, then stacking them manually and using the stacked model as the starting point for DeepGLR. As noted in the README, training a DeepGLR from scratch requires a lot of hyperparameter tuning; it is easier to train separate single GLRs first.
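As a rough illustration of the stacking step, something along these lines could work. This is only a sketch: `StackedGLR`, `make_glr`, the checkpoint paths, and the assumption that each single GLR was saved with `torch.save(model.state_dict(), path)` are placeholders, not the exact classes or conventions of this repo.

```python
import torch
from torch import nn


class StackedGLR(nn.Module):
    """Placeholder cascade: apply K pretrained single-GLR stages in sequence."""

    def __init__(self, stages):
        super().__init__()
        self.stages = nn.ModuleList(stages)

    def forward(self, x):
        for stage in self.stages:
            x = stage(x)
        return x


def build_stacked_glr(make_glr, ckpt_paths, device="cpu"):
    """Load each single-GLR checkpoint and stack the stages as a warm start."""
    stages = []
    for path in ckpt_paths:
        glr = make_glr()  # fresh single-stage model (hypothetical constructor)
        glr.load_state_dict(torch.load(path, map_location=device))
        stages.append(glr)
    return StackedGLR(stages).to(device)


# Hypothetical usage: fine-tune this warm-started model with a small learning
# rate instead of training the whole cascade from random initialization.
# model = build_stacked_glr(GLR, ["glr1.pth", "glr2.pth", "glr3.pth", "glr4.pth"])
```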

Best.

huyvd7 avatar Apr 07 '20 07:04 huyvd7

Thanks, @huyvd7. I have trained a GLR model; I will try your method.

robotzheng avatar Apr 07 '20 07:04 robotzheng

Hi @huyvd7, have you tried a ResNet-style architecture for your DeepGLR, i.e., adding more direct (skip) connections?

robotzheng avatar Apr 07 '20 07:04 robotzheng

Hi @robotzheng, thanks for your suggestion. However, this is an implementation of the work from Zeng et al., so I chose to respect the original architecture. You can try changing the architecture if it makes sense to you.
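If you do want to experiment with that idea, one minimal way to add a direct connection is to wrap an existing stage so that it only predicts a residual correction. This is a generic sketch, not part of this implementation or of Zeng et al.'s architecture; `ResidualStage` and `model.stages` are hypothetical names.

```python
import torch
from torch import nn


class ResidualStage(nn.Module):
    """Hypothetical wrapper: y = x + stage(x), i.e. an identity skip connection
    around an existing restoration stage so the stage learns a correction."""

    def __init__(self, stage):
        super().__init__()
        self.stage = stage

    def forward(self, x):
        return x + self.stage(x)


# Hypothetical usage, assuming a cascade stored in an nn.ModuleList:
# model.stages = nn.ModuleList(ResidualStage(s) for s in model.stages)
```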

huyvd7 avatar Apr 07 '20 23:04 huyvd7

I have trained four GLR models, but when I train a DGLR model using them (learning rate 1e-6), the loss only drops to a few thousand and the test result is worse than any single GLR model. It is very strange. Could you give me some clues for the hyperparameter tuning, maybe warmup etc.? (A generic sketch of what I mean follows the logs below.)

[root@A01-R04-I220-17 pytorch-deepglr]# python3 validate_GLR.py dataset/test/ -m glr2 -w 324 -o ./
CUDA: True, device: cuda
torch.Size([3, 324, 324]) torch.Size([3, 324, 324]) torch.Size([9, 3, 36, 36]), 9/9 Prediction time: 18.170697450637817 PSNR: 24.232001993063406 SSIM: 0.8946895443993425 Saved ./0.png
torch.Size([3, 324, 324]) torch.Size([3, 324, 324]) torch.Size([9, 3, 36, 36]), 9/9 Prediction time: 36.70376658439636 PSNR: 26.772219231926385 SSIM: 0.7657313323251355 Saved ./1.png
torch.Size([3, 324, 324]) torch.Size([3, 324, 324]) torch.Size([9, 3, 36, 36]), 9/9 Prediction time: 55.09617733955383 PSNR: 27.688053671692337 SSIM: 0.7803515355510561 Saved ./2.png
torch.Size([3, 324, 324]) torch.Size([3, 324, 324]) torch.Size([9, 3, 36, 36]), 9/9 Prediction time: 73.81252861022949 PSNR: 28.366392689622725 SSIM: 0.7278776538706415 Saved ./3.png
torch.Size([3, 324, 324]) torch.Size([3, 324, 324]) torch.Size([9, 3, 36, 36]), 9/9 Prediction time: 92.69762015342712 PSNR: 28.987624907959052 SSIM: 0.8181714040228298 Saved ./4.png
Mean PSNR: 27.209 Mean SSIM: 0.797
Total running time: 92.887

[root@A01-R04-I220-17 pytorch-deepglr]# python3 validate_DGLR.py dataset/test/ -m dglr1 -w 324 -o ./
CUDA: True, device: cuda
torch.Size([3, 324, 324]) torch.Size([3, 324, 324]) torch.Size([9, 3, 36, 36]), 9/9 Prediction time: 71.4286847114563 PSNR: 22.138853568026818 SSIM: 0.8053459041015122 Saved ./0.png
torch.Size([3, 324, 324]) torch.Size([3, 324, 324]) torch.Size([9, 3, 36, 36]), 9/9 Prediction time: 143.2804732322693 PSNR: 25.052257892718657 SSIM: 0.7207063204008689 Saved ./1.png
torch.Size([3, 324, 324]) torch.Size([3, 324, 324]) torch.Size([9, 3, 36, 36]), 9/9 Prediction time: 236.26569437980652 PSNR: 22.95616563387547 SSIM: 0.34679425042434203 Saved ./2.png
torch.Size([3, 324, 324]) torch.Size([3, 324, 324]) torch.Size([9, 3, 36, 36]), 9/9 Prediction time: 319.9859173297882 PSNR: 24.57338341125765 SSIM: 0.7547493039864405 Saved ./3.png
torch.Size([3, 324, 324]) torch.Size([3, 324, 324]) torch.Size([9, 3, 36, 36]), 9/9 Prediction time: 392.2982153892517 PSNR: 24.29301121994221 SSIM: 0.35664995354226975 Saved ./4.png
Mean PSNR: 23.803 Mean SSIM: 0.597
Total running time: 392.496

CUDA: True
Total image patches: 405
load four glr models.
Wed Apr 8 17:42:11 2020 [1] loss: 28699048.123, time elapsed: 91.15628266334534
Wed Apr 8 17:43:44 2020 [2] loss: 4380996947.936, time elapsed: 183.98294496536255
Wed Apr 8 17:45:17 2020 [3] loss: 3806.706, time elapsed: 276.8562080860138
Wed Apr 8 17:46:50 2020 [4] loss: 42765.303, time elapsed: 369.7352330684662
Wed Apr 8 17:48:22 2020 [5] loss: 6825798.181, time elapsed: 462.62313079833984
Wed Apr 8 17:49:55 2020 [6] loss: 9849.157, time elapsed: 555.1242043972015
Wed Apr 8 17:51:28 2020 [7] loss: 840.405, time elapsed: 648.2627387046814
Wed Apr 8 17:53:01 2020 [8] loss: 3580.285, time elapsed: 741.2946929931641
Wed Apr 8 17:54:34 2020 [9] loss: 770.831, time elapsed: 834.3132054805756
Wed Apr 8 17:56:07 2020 [10] loss: 22552.827, time elapsed: 927.3255677223206
save @ epoch 10
Wed Apr 8 17:57:40 2020 [11] loss: 719.561, time elapsed: 1020.2078688144684
Wed Apr 8 17:59:13 2020 [12] loss: 806.813, time elapsed: 1113.2309997081757
Wed Apr 8 18:00:46 2020 [13] loss: 1272.709, time elapsed: 1206.2352206707
Wed Apr 8 18:02:19 2020 [14] loss: 839.889, time elapsed: 1299.2292194366455
Wed Apr 8 18:03:52 2020 [15] loss: 37378.186, time elapsed: 1392.174908876419
Wed Apr 8 18:05:25 2020 [16] loss: 58270645.545, time elapsed: 1485.118041753769
Wed Apr 8 18:06:58 2020 [17] loss: 2305142461.112, time elapsed: 1578.0344014167786
Wed Apr 8 18:08:31 2020 [18] loss: 1352.420, time elapsed: 1670.8754751682281
Wed Apr 8 18:10:03 2020 [19] loss: 3489.405, time elapsed: 1763.0194175243378
Wed Apr 8 18:11:36 2020 [20] loss: 2182.005, time elapsed: 1855.939778327942
save @ epoch 20
Wed Apr 8 18:13:09 2020 [21] loss: 2464.128, time elapsed: 1948.9913835525513
Wed Apr 8 18:14:42 2020 [22] loss: 1568.485, time elapsed: 2041.987242937088
Wed Apr 8 18:16:15 2020 [23] loss: 3798.734, time elapsed: 2134.933037996292
Wed Apr 8 18:17:48 2020 [24] loss: 5297351.908, time elapsed: 2227.90717792511
Wed Apr 8 18:19:21 2020 [25] loss: 36039.352, time elapsed: 2320.851002931595
Wed Apr 8 18:20:54 2020 [26] loss: 81358.759, time elapsed: 2413.7696454524994
Wed Apr 8 18:22:27 2020 [27] loss: 821399.532, time elapsed: 2506.7561650276184
Wed Apr 8 18:23:59 2020 [28] loss: 4941.458, time elapsed: 2599.546965122223
Wed Apr 8 18:25:32 2020 [29] loss: 13569.867, time elapsed: 2692.538019180298
Wed Apr 8 18:27:05 2020 [30] loss: 2532.590, time elapsed: 2785.560569524765
save @ epoch 30
Wed Apr 8 18:28:38 2020 [31] loss: 4174.638, time elapsed: 2878.4395129680634
Wed Apr 8 18:30:11 2020 [32] loss: 2241.009, time elapsed: 2971.3129935264587
Wed Apr 8 18:31:44 2020 [33] loss: 12405305.918, time elapsed: 3064.1904554367065
Wed Apr 8 18:33:17 2020 [34] loss: 13022970.469, time elapsed: 3156.9640731811523
Wed Apr 8 18:34:49 2020 [35] loss: 26860.282, time elapsed: 3249.6265144348145
Wed Apr 8 18:36:22 2020 [36] loss: 14942.899, time elapsed: 3342.233377456665
Wed Apr 8 18:37:54 2020 [37] loss: 16221768.743, time elapsed: 3434.1890411376953
Wed Apr 8 18:39:26 2020 [38] loss: 23368.491, time elapsed: 3526.3723068237305
Wed Apr 8 18:40:59 2020 [39] loss: 2222.862, time elapsed: 3618.9984946250916
Wed Apr 8 18:42:32 2020 [40] loss: 1368.408, time elapsed: 3711.8353774547577
save @ epoch 40
Wed Apr 8 18:44:04 2020 [41] loss: 662623.621, time elapsed: 3804.5742955207825
Wed Apr 8 18:45:36 2020 [42] loss: 27884.571, time elapsed: 3896.4985785484314
Wed Apr 8 18:47:08 2020 [43] loss: 1543.571, time elapsed: 3988.4424555301666
Wed Apr 8 18:48:40 2020 [44] loss: 2530.772, time elapsed: 4080.28059053421
Wed Apr 8 18:50:12 2020 [45] loss: 238767.187, time elapsed: 4172.163241147995
Wed Apr 8 18:51:44 2020 [46] loss: 690.828, time elapsed: 4264.043545722961
Wed Apr 8 18:53:16 2020 [47] loss: 418.249, time elapsed: 4355.83909702301
Wed Apr 8 18:54:47 2020 [48] loss: 83807.180, time elapsed: 4447.655849695206
Wed Apr 8 18:56:19 2020 [49] loss: 340035.318, time elapsed: 4539.494091510773
Wed Apr 8 18:57:51 2020 [50] loss: 609.510, time elapsed: 4631.551383733749
save @ epoch 50
Wed Apr 8 18:59:24 2020 [51] loss: 341.841, time elapsed: 4724.327443361282
Wed Apr 8 19:00:57 2020 [52] loss: 358.379, time elapsed: 4817.00167798996
Wed Apr 8 19:02:29 2020 [53] loss: 4503.457, time elapsed: 4909.684851646423
Wed Apr 8 19:04:08 2020 [54] loss: 352.104, time elapsed: 5008.621166229248
Wed Apr 8 19:05:45 2020 [55] loss: 293.110, time elapsed: 5105.50665140152
Wed Apr 8 19:07:18 2020 [56] loss: 239.958, time elapsed: 5198.021396398544
Wed Apr 8 19:08:50 2020 [57] loss: 5855517981.070, time elapsed: 5290.533970594406
Wed Apr 8 19:10:21 2020 [58] loss: 404.903, time elapsed: 5381.366308927536
Wed Apr 8 19:11:54 2020 [59] loss: 25666.112, time elapsed: 5473.814644813538
Wed Apr 8 19:13:26 2020 [60] loss: 28479.964, time elapsed: 5566.246972560883
save @ epoch 60
Wed Apr 8 19:14:58 2020 [61] loss: 21870.992, time elapsed: 5658.585961103439
Wed Apr 8 19:16:31 2020 [62] loss: 53266.954, time elapsed: 5750.849793672562
Wed Apr 8 19:18:03 2020 [63] loss: 2972.962, time elapsed: 5843.433084964752
Wed Apr 8 19:19:46 2020 [64] loss: 60990.980, time elapsed: 5946.445726633072
Wed Apr 8 19:21:29 2020 [65] loss: 1267.569, time elapsed: 6049.459476232529
Wed Apr 8 19:23:12 2020 [66] loss: 12948.773, time elapsed: 6152.459672451019
Wed Apr 8 19:24:55 2020 [67] loss: 236066.217, time elapsed: 6255.46568274498
Wed Apr 8 19:26:38 2020 [68] loss: 289470.756, time elapsed: 6358.4380402565
Wed Apr 8 19:28:21 2020 [69] loss: 4067.074, time elapsed: 6461.43091750145
Wed Apr 8 19:30:04 2020 [70] loss: 5594934.839, time elapsed: 6564.39287686348
save @ epoch 70
Wed Apr 8 19:31:47 2020 [71] loss: 3351.521, time elapsed: 6667.383355140686
Wed Apr 8 19:33:15 2020 [72] loss: 7289.026, time elapsed: 6755.454119443893
Wed Apr 8 19:34:41 2020 [73] loss: 5547.603, time elapsed: 6841.032315254211
Wed Apr 8 19:36:12 2020 [74] loss: 9985.499, time elapsed: 6932.49100279808
Wed Apr 8 19:37:43 2020 [75] loss: 4113.392, time elapsed: 7022.986564159393
Wed Apr 8 19:39:14 2020 [76] loss: 9297.825, time elapsed: 7114.602356910706
Wed Apr 8 19:40:47 2020 [77] loss: 27624.502, time elapsed: 7206.997621297836
Wed Apr 8 19:42:19 2020 [78] loss: 107140111.528, time elapsed: 7299.419520139694
Wed Apr 8 19:43:52 2020 [79] loss: 337765.573, time elapsed: 7391.838447809219
Wed Apr 8 19:45:24 2020 [80] loss: 44800.208, time elapsed: 7484.345376729965
save @ epoch 80
Wed Apr 8 19:46:57 2020 [81] loss: 4164.298, time elapsed: 7576.760746955872
Wed Apr 8 19:48:29 2020 [82] loss: 1494.209, time elapsed: 7669.142573833466
Wed Apr 8 19:50:01 2020 [83] loss: 42806.365, time elapsed: 7761.454262018204
Wed Apr 8 19:51:34 2020 [84] loss: 179602.153, time elapsed: 7853.837611198425
Wed Apr 8 19:53:06 2020 [85] loss: 4865134.109, time elapsed: 7945.807110548019
Wed Apr 8 19:54:37 2020 [86] loss: 2172.896, time elapsed: 8037.489516019821
Wed Apr 8 19:56:09 2020 [87] loss: 1393.788, time elapsed: 8129.208279848099
Wed Apr 8 19:57:41 2020 [88] loss: 36531.240, time elapsed: 8220.899200677872
Wed Apr 8 19:59:12 2020 [89] loss: 3347.594, time elapsed: 8312.640513896942
Wed Apr 8 20:00:44 2020 [90] loss: 124948.865, time elapsed: 8404.413475990295
save @ epoch 90
Wed Apr 8 20:02:16 2020 [91] loss: 4626.535, time elapsed: 8496.072904586792
Wed Apr 8 20:03:48 2020 [92] loss: 320541871.836, time elapsed: 8587.857550144196
Wed Apr 8 20:05:19 2020 [93] loss: 14117702.825, time elapsed: 8679.63119482994
Wed Apr 8 20:06:51 2020 [94] loss: 6608.544, time elapsed: 8771.368431568146
Wed Apr 8 20:08:21 2020 [95] loss: 866.350, time elapsed: 8861.306810617447
Wed Apr 8 20:09:54 2020 [96] loss: 1144.876, time elapsed: 8953.770832300186
Wed Apr 8 20:11:26 2020 [97] loss: 3951.061, time elapsed: 9046.313138246536
Wed Apr 8 20:12:59 2020 [98] loss: 4428.774, time elapsed: 9138.744377851486
Wed Apr 8 20:14:31 2020 [99] loss: 49450.833, time elapsed: 9230.966096639633
Wed Apr 8 20:16:03 2020 [100] loss: 1166106.878, time elapsed: 9322.85964846611
save @ epoch 100
Wed Apr 8 20:17:34 2020 [101] loss: 8528.943, time elapsed: 9414.703340053558
Wed Apr 8 20:19:06 2020 [102] loss: 196585.189, time elapsed: 9506.529341220856
Wed Apr 8 20:20:38 2020 [103] loss: 814.915, time elapsed: 9598.167202472687
Wed Apr 8 20:22:10 2020 [104] loss: 1102.341, time elapsed: 9689.869136571884
Wed Apr 8 20:23:41 2020 [105] loss: 944234.108, time elapsed: 9781.60225200653
Wed Apr 8 20:25:13 2020 [106] loss: 1749.746, time elapsed: 9873.32220196724
Wed Apr 8 20:26:42 2020 [107] loss: 2424.887, time elapsed: 9961.997639894485
Wed Apr 8 20:28:18 2020 [108] loss: 442.122, time elapsed: 10057.93766283989
Wed Apr 8 20:30:01 2020 [109] loss: 808.784, time elapsed: 10160.895748376846
Wed Apr 8 20:31:44 2020 [110] loss: 224771.204, time elapsed: 10263.803529977798
save @ epoch 110
Wed Apr 8 20:33:26 2020 [111] loss: 344.856, time elapsed: 10366.67761015892
Wed Apr 8 20:35:09 2020 [112] loss: 5775.507, time elapsed: 10469.502502202988
Wed Apr 8 20:36:52 2020 [113] loss: 16578.309, time elapsed: 10572.330513715744
Wed Apr 8 20:38:35 2020 [114] loss: 217025.124, time elapsed: 10675.158657550812
Wed Apr 8 20:40:18 2020 [115] loss: 12033.736, time elapsed: 10778.006921291351
Wed Apr 8 20:41:51 2020 [116] loss: 51086301.529, time elapsed: 10870.740975856781
Wed Apr 8 20:43:22 2020 [117] loss: 961.639, time elapsed: 10962.399029493332
Wed Apr 8 20:44:51 2020 [118] loss: 144963.168, time elapsed: 11051.018603563309
Wed Apr 8 20:46:16 2020 [119] loss: 9741.859, time elapsed: 11136.463475465775
Wed Apr 8 20:47:44 2020 [120] loss: 1899.726, time elapsed: 11224.605322122574
save @ epoch 120
Wed Apr 8 20:49:16 2020 [121] loss: 1379.419, time elapsed: 11316.594392299652
Wed Apr 8 20:50:48 2020 [122] loss: 1048.602, time elapsed: 11408.46797132492
Wed Apr 8 20:52:20 2020 [123] loss: 908.252, time elapsed: 11500.350952148438
Wed Apr 8 20:53:52 2020 [124] loss: 17436.160, time elapsed: 11592.320937395096
Wed Apr 8 20:55:24 2020 [125] loss: 1353.661, time elapsed: 11684.254593610764
Wed Apr 8 20:56:56 2020 [126] loss: 5654715.804, time elapsed: 11776.155443668365
Wed Apr 8 20:58:28 2020 [127] loss: 21398309.314, time elapsed: 11868.006656169891
Wed Apr 8 21:00:00 2020 [128] loss: 17267161.659, time elapsed: 11959.816833257675
Wed Apr 8 21:01:31 2020 [129] loss: 24633.209, time elapsed: 12051.547986030579
Wed Apr 8 21:03:03 2020 [130] loss: 5707.376, time elapsed: 12143.300612926483
save @ epoch 130
Wed Apr 8 21:04:35 2020 [131] loss: 20343.917, time elapsed: 12235.098524332047
Wed Apr 8 21:06:07 2020 [132] loss: 10868.699, time elapsed: 12326.815776586533
Wed Apr 8 21:07:38 2020 [133] loss: 16778.450, time elapsed: 12418.597933292389
Wed Apr 8 21:09:10 2020 [134] loss: 706438425.542, time elapsed: 12510.390802145004
Wed Apr 8 21:10:42 2020 [135] loss: 784.414, time elapsed: 12602.133189439774
Wed Apr 8 21:12:13 2020 [136] loss: 3233.413, time elapsed: 12693.60379242897
Wed Apr 8 21:13:43 2020 [137] loss: 138018966.796, time elapsed: 12783.30856847763
Wed Apr 8 21:15:15 2020 [138] loss: 54978182.322, time elapsed: 12875.469939470291
Wed Apr 8 21:16:47 2020 [139] loss: 13026304068.340, time elapsed: 12967.359166383743
Wed Apr 8 21:18:15 2020 [140] loss: 3841.076, time elapsed: 13055.36816740036
save @ epoch 140
Wed Apr 8 21:19:48 2020 [141] loss: 275.196, time elapsed: 13147.781096935272
Wed Apr 8 21:21:20 2020 [142] loss: 2466.706, time elapsed: 13240.204514980316
Wed Apr 8 21:22:52 2020 [143] loss: 2074.647, time elapsed: 13332.594478607178
Wed Apr 8 21:24:25 2020 [144] loss: 12669.192, time elapsed: 13425.154649972916
Wed Apr 8 21:25:58 2020 [145] loss: 9678.925, time elapsed: 13517.95210981369
Wed Apr 8 21:27:30 2020 [146] loss: 9649.783, time elapsed: 13610.475712299347
Wed Apr 8 21:29:03 2020 [147] loss: 8745.278, time elapsed: 13702.893873929977
Wed Apr 8 21:30:35 2020 [148] loss: 3122.006, time elapsed: 13795.33949637413
Wed Apr 8 21:32:07 2020 [149] loss: 6734.710, time elapsed: 13887.704800128937
Wed Apr 8 21:33:40 2020 [150] loss: 300453.795, time elapsed: 13980.066706180573
save @ epoch 150
Wed Apr 8 21:35:12 2020 [151] loss: 5732738739.101, time elapsed: 14072.448618888855
Wed Apr 8 21:36:45 2020 [152] loss: 12931.162, time elapsed: 14164.829667806625
Wed Apr 8 21:38:17 2020 [153] loss: 4574.494, time elapsed: 14257.225501060486
Wed Apr 8 21:39:49 2020 [154] loss: 23840.875, time elapsed: 14349.61429309845
Wed Apr 8 21:41:22 2020 [155] loss: 43790008.620, time elapsed: 14442.014100790024
Wed Apr 8 21:42:54 2020 [156] loss: 46215.610, time elapsed: 14534.36073756218
Wed Apr 8 21:44:27 2020 [157] loss: 164779.682, time elapsed: 14626.765259981155
Wed Apr 8 21:45:59 2020 [158] loss: 4258.095, time elapsed: 14719.117589950562
Wed Apr 8 21:47:31 2020 [159] loss: 920.604, time elapsed: 14811.505740880966
Wed Apr 8 21:48:59 2020 [160] loss: 7249.924, time elapsed: 14899.224462032318
save @ epoch 160
Wed Apr 8 21:50:24 2020 [161] loss: 6583.588, time elapsed: 14984.41795182228
Wed Apr 8 21:51:53 2020 [162] loss: 10062504.244, time elapsed: 15073.241025447845
Wed Apr 8 21:53:25 2020 [163] loss: 3505247.772, time elapsed: 15165.581409215927
Wed Apr 8 21:54:58 2020 [164] loss: 1093.882, time elapsed: 15257.95053768158
Wed Apr 8 21:56:30 2020 [165] loss: 34798.859, time elapsed: 15350.281907320023
Wed Apr 8 21:58:02 2020 [166] loss: 6151.933, time elapsed: 15442.626341342926
Wed Apr 8 21:59:35 2020 [167] loss: 301071.719, time elapsed: 15534.956279993057
Wed Apr 8 22:01:07 2020 [168] loss: 2678.903, time elapsed: 15627.2756254673
Wed Apr 8 22:02:39 2020 [169] loss: 1417.803, time elapsed: 15719.6351044178
Wed Apr 8 22:04:12 2020 [170] loss: 3189063.584, time elapsed: 15811.963452339172
save @ epoch 170
Wed Apr 8 22:05:44 2020 [171] loss: 281770111.528, time elapsed: 15904.207070350647
Wed Apr 8 22:07:16 2020 [172] loss: 2335.198, time elapsed: 15996.033240318298
Wed Apr 8 22:08:48 2020 [173] loss: 77521.198, time elapsed: 16088.388737678528
Wed Apr 8 22:10:20 2020 [174] loss: 126979732.842, time elapsed: 16180.687359571457
Wed Apr 8 22:11:53 2020 [175] loss: 2332.836, time elapsed: 16273.028414964676
Wed Apr 8 22:13:24 2020 [176] loss: 2247.767, time elapsed: 16364.17386174202
Wed Apr 8 22:14:53 2020 [177] loss: 73530866.189, time elapsed: 16453.310804367065
Wed Apr 8 22:16:24 2020 [178] loss: 4509537.983, time elapsed: 16544.63923740387
Wed Apr 8 22:17:49 2020 [179] loss: 18217637.052, time elapsed: 16629.619928598404
Wed Apr 8 22:19:20 2020 [180] loss: 422969.882, time elapsed: 16720.211461782455
save @ epoch 180
Wed Apr 8 22:20:51 2020 [181] loss: 8202797.891, time elapsed: 16811.29319214821
Wed Apr 8 22:22:23 2020 [182] loss: 34510734.666, time elapsed: 16903.519106388092
Wed Apr 8 22:23:56 2020 [183] loss: 20387343.391, time elapsed: 16995.850140810013
Wed Apr 8 22:25:28 2020 [184] loss: 5021543.849, time elapsed: 17088.160405158997
Wed Apr 8 22:27:00 2020 [185] loss: 315514364.810, time elapsed: 17180.450793266296
Wed Apr 8 22:28:31 2020 [186] loss: 1672779.228, time elapsed: 17271.00333762169
Wed Apr 8 22:29:56 2020 [187] loss: 909.750, time elapsed: 17356.41486644745
Wed Apr 8 22:31:22 2020 [188] loss: 3368.796, time elapsed: 17441.86795949936
Wed Apr 8 22:32:47 2020 [189] loss: 8240042.180, time elapsed: 17527.275011062622
Wed Apr 8 22:34:18 2020 [190] loss: 952142.315, time elapsed: 17618.110630989075
save @ epoch 190
Wed Apr 8 22:35:50 2020 [191] loss: 3206.983, time elapsed: 17710.538067102432
Wed Apr 8 22:37:23 2020 [192] loss: 909.231, time elapsed: 17802.951418161392
Wed Apr 8 22:38:58 2020 [193] loss: 701.097, time elapsed: 17897.998318195343
Wed Apr 8 22:40:41 2020 [194] loss: 5139.965, time elapsed: 18000.969222307205
Wed Apr 8 22:42:23 2020 [195] loss: 1131.920, time elapsed: 18102.922669887543
Wed Apr 8 22:44:06 2020 [196] loss: 9250.234, time elapsed: 18205.852262735367
Wed Apr 8 22:45:48 2020 [197] loss: 4428.385, time elapsed: 18308.697818756104
Wed Apr 8 22:47:31 2020 [198] loss: 1454.249, time elapsed: 18411.597235441208
Wed Apr 8 22:49:12 2020 [199] loss: 4756.356, time elapsed: 18512.2371635437
Wed Apr 8 22:50:37 2020 [200] loss: 4778.187, time elapsed: 18597.111175060272
save @ epoch 200
Total running time: 18597.149
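To be concrete about what I mean by warmup and similar tricks, here is a generic PyTorch sketch of a training loop with a short linear learning-rate warmup and gradient clipping. Everything below (the tiny model, the dummy loader, the loss, the numbers) is a placeholder for illustration only; it is not taken from train_DGLR.py.

```python
import torch
from torch import nn, optim

# Placeholders: substitute the real DeepGLR model, dataloader, and loss here.
model = nn.Sequential(nn.Conv2d(3, 3, 3, padding=1))
loader = [(torch.randn(2, 3, 36, 36), torch.randn(2, 3, 36, 36))]
criterion = nn.MSELoss()

base_lr = 1e-4        # assumed value, to be tuned
warmup_epochs = 5     # assumed value, to be tuned
optimizer = optim.Adam(model.parameters(), lr=base_lr)
# Linear warmup over the first few epochs, then a constant learning rate.
scheduler = optim.lr_scheduler.LambdaLR(
    optimizer, lambda epoch: min(1.0, (epoch + 1) / warmup_epochs))

for epoch in range(20):
    for noisy, clean in loader:
        optimizer.zero_grad()
        loss = criterion(model(noisy), clean)
        loss.backward()
        # Clip exploding gradients; the occasional huge loss spikes in the
        # logs above suggest a few very large updates are derailing training.
        nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
        optimizer.step()
    scheduler.step()
```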

robotzheng avatar Apr 10 '20 01:04 robotzheng