
Running slower and slower?

Open yanwu1 opened this issue 7 years ago • 8 comments

Hi,

I tried to run the demo models. One iteration took a few seconds at the beginning, but a few hundred seconds after an hour of running. Any idea what happened?

Thanks a lot.

yanwu1 avatar Feb 06 '18 04:02 yanwu1

Could you provide some code that will reproduce this bug? What were you running and what was in the .ini file you used?

Zach-ER avatar Feb 06 '18 11:02 Zach-ER

We were using the demo config with the PROMISE12 dataset, i.e. "net_segment train -c promise12_demo_train_config.ini". The machine runs Ubuntu 16.04, Python 3.5.4, and a GTX 1070. The package was installed via pip:

$ conda list |grep -i nifty
NiftyNet                  0.2.2                     <pip>

The .ini file:

[promise12]
path_to_search = dev/DATA/PROMISE12/training_data
filename_contains = Case,mhd
filename_not_contains = Case2,segmentation
spatial_window_size = (64, 64, 64)
interp_order = 3
axcodes=(A, R, S)

[label]
path_to_search = dev/DATA/PROMISE12/training_data
filename_contains = Case,_segmentation,mhd
filename_not_contains = Case2
spatial_window_size = (64, 64, 64)
interp_order = 0
axcodes=(A, R, S)

############################## system configuration sections
[SYSTEM]
cuda_devices = ""
num_threads = 2
num_gpus = 1
model_dir = models/promise12

[NETWORK]
name = dense_vnet
activation_function = prelu
batch_size = 1

# volume level preprocessing
volume_padding_size = 0
# histogram normalisation
histogram_ref_file = standardisation_models.txt
norm_type = percentile
cutoff = (0.01, 0.99)
normalisation = True
whitening = True
normalise_foreground_only=True
foreground_type = otsu_plus
multimod_foreground_type = and
window_sampling = resize

queue_length = 8

[TRAINING]
sample_per_volume = 4
rotation_angle = (-10.0, 10.0)
scaling_percentage = (-10.0, 10.0)
random_flipping_axes= 1
lr = 0.00002
loss_type = Dice
starting_iter = 0
save_every_n = 12500
max_iter = 25000
max_checkpoints = 20

############################ custom configuration sections
[SEGMENTATION]
image = promise12
label = label
output_prob = False
num_classes = 2
label_normalisation = True
min_numb_labels = 2
min_sampling_ratio = 0.000001

From the attached TensorBoard chart (tf_log), one can see that it slows down from time to time, e.g. 6-9 AM and 3-6 PM.

The runtime log is below. Some iterations are much slower than others. There were no other jobs running on the machine, and CPU usage was high the whole time. We suspect it could come from the data augmentation.

log file:

...

INFO:niftynet:2018-02-04 05:00:33,204: Training iter 3112, loss=0.37107598781585693 (14.242115s)
INFO:niftynet:2018-02-04 05:01:31,755: Training iter 3113, loss=0.30466264486312866 (58.551371s)
INFO:niftynet:2018-02-04 05:01:40,929: Training iter 3114, loss=0.37340623140335083 (9.173162s)
INFO:niftynet:2018-02-04 05:02:25,169: Training iter 3115, loss=0.33443301916122437 (44.239202s)
INFO:niftynet:2018-02-04 05:02:45,410: Training iter 3116, loss=0.30701494216918945 (20.240570s)
INFO:niftynet:2018-02-04 05:04:11,077: Training iter 3117, loss=0.33207225799560547 (85.666615s)
INFO:niftynet:2018-02-04 05:04:11,866: Training iter 3118, loss=0.3369055986404419 (0.789394s)
INFO:niftynet:2018-02-04 05:04:13,119: Training iter 3119, loss=0.4060615301132202 (1.252192s)
INFO:niftynet:2018-02-04 05:06:21,964: Training iter 3120, loss=0.43466782569885254 (128.844309s)
INFO:niftynet:2018-02-04 05:06:33,036: Training iter 3121, loss=0.2989329695701599 (11.071814s)
INFO:niftynet:2018-02-04 05:08:13,050: Training iter 3122, loss=0.36755990982055664 (100.013732s)
INFO:niftynet:2018-02-04 05:08:16,841: Training iter 3123, loss=0.33224594593048096 (3.790390s)
INFO:niftynet:2018-02-04 05:10:04,096: Training iter 3124, loss=0.3755572438240051 (107.254568s)
INFO:niftynet:2018-02-04 05:10:05,747: Training iter 3125, loss=0.38522469997406006 (1.651027s)
INFO:niftynet:2018-02-04 05:10:08,195: Training iter 3126, loss=0.4287583827972412 (2.446937s)
INFO:niftynet:2018-02-04 05:10:22,958: Training iter 3127, loss=0.3286384344100952 (14.763304s)
INFO:niftynet:2018-02-04 05:12:31,518: Training iter 3128, loss=0.3387739658355713 (128.559851s)
INFO:niftynet:2018-02-04 05:12:38,741: Training iter 3129, loss=0.37458348274230957 (7.222261s)
INFO:niftynet:2018-02-04 05:13:36,876: Training iter 3130, loss=0.3135569095611572 (58.134953s)
INFO:niftynet:2018-02-04 05:13:40,133: Training iter 3131, loss=0.3529359698295593 (3.256537s)
INFO:niftynet:2018-02-04 05:13:43,034: Training iter 3132, loss=0.3446159362792969 (2.900526s)
INFO:niftynet:2018-02-04 05:13:45,638: Training iter 3133, loss=0.33163565397262573 (2.603957s)
INFO:niftynet:2018-02-04 05:13:47,740: Training iter 3134, loss=0.3361837863922119 (2.101159s)
INFO:niftynet:2018-02-04 05:13:58,995: Training iter 3135, loss=0.37792855501174927 (11.254240s)
INFO:niftynet:2018-02-04 05:14:42,743: Training iter 3136, loss=0.3373921513557434 (43.748390s)
INFO:niftynet:2018-02-04 05:15:02,632: Training iter 3137, loss=0.3403393626213074 (19.888261s)
INFO:niftynet:2018-02-04 05:15:06,495: Training iter 3138, loss=0.26892024278640747 (3.862314s)
INFO:niftynet:2018-02-04 05:15:41,367: Training iter 3139, loss=0.3844781517982483 (34.871233s)
INFO:niftynet:2018-02-04 05:16:12,515: Training iter 3140, loss=0.252951979637146 (31.147876s)
INFO:niftynet:2018-02-04 05:16:16,808: Training iter 3141, loss=0.40091514587402344 (4.292411s)
INFO:niftynet:2018-02-04 05:16:19,190: Training iter 3142, loss=0.3135116696357727 (2.382251s)
INFO:niftynet:2018-02-04 05:16:45,683: Training iter 3143, loss=0.33606576919555664 (26.492328s)
INFO:niftynet:2018-02-04 05:17:46,796: Training iter 3144, loss=0.36048412322998047 (61.113036s)
INFO:niftynet:2018-02-04 05:17:48,378: Training iter 3145, loss=0.29509061574935913 (1.580717s)
INFO:niftynet:2018-02-04 05:20:19,689: Training iter 3146, loss=0.35901862382888794 (151.311214s)
INFO:niftynet:2018-02-04 05:20:24,885: Training iter 3147, loss=0.31175869703292847 (5.195914s)
INFO:niftynet:2018-02-04 05:20:29,165: Training iter 3148, loss=0.2760446071624756 (4.279379s)
INFO:niftynet:2018-02-04 05:20:34,029: Training iter 3149, loss=0.3915928602218628 (4.863679s)
INFO:niftynet:2018-02-04 05:22:22,000: Training iter 3150, loss=0.4051785469055176 (107.969888s)
INFO:niftynet:2018-02-04 05:22:25,146: Training iter 3151, loss=0.31178414821624756 (3.145937s)
INFO:niftynet:2018-02-04 05:22:34,333: Training iter 3152, loss=0.354453444480896 (9.186936s)
INFO:niftynet:2018-02-04 05:24:08,897: Training iter 3153, loss=0.3638203740119934 (94.563101s)
INFO:niftynet:2018-02-04 05:24:51,947: Training iter 3154, loss=0.4660397171974182 (43.049718s)
INFO:niftynet:2018-02-04 05:26:30,054: Training iter 3155, loss=0.3705129623413086 (98.107050s)
INFO:niftynet:2018-02-04 05:26:36,323: Training iter 3156, loss=0.31395673751831055 (6.268041s)
INFO:niftynet:2018-02-04 05:26:39,645: Training iter 3157, loss=0.4561350345611572 (3.321751s)
INFO:niftynet:2018-02-04 05:27:14,898: Training iter 3158, loss=0.37726694345474243 (35.252799s)
INFO:niftynet:2018-02-04 05:27:21,865: Training iter 3159, loss=0.3682054877281189 (6.966398s)
INFO:niftynet:2018-02-04 05:28:41,427: Training iter 3160, loss=0.44507360458374023 (79.561367s)
INFO:niftynet:2018-02-04 05:28:44,359: Training iter 3161, loss=0.326259970664978 (2.931988s)
INFO:niftynet:2018-02-04 05:29:18,675: Training iter 3162, loss=0.41969871520996094 (34.315881s)
INFO:niftynet:2018-02-04 05:29:25,339: Training iter 3163, loss=0.29812759160995483 (6.663559s)
INFO:niftynet:2018-02-04 05:30:36,847: Training iter 3164, loss=0.352571964263916 (71.507353s)
INFO:niftynet:2018-02-04 05:30:37,595: Training iter 3165, loss=0.36614125967025757 (0.747251s)
INFO:niftynet:2018-02-04 05:31:55,785: Training iter 3166, loss=0.3596130609512329 (78.189229s)
INFO:niftynet:2018-02-04 05:32:00,167: Training iter 3167, loss=0.32823312282562256 (4.381795s)
INFO:niftynet:2018-02-04 05:32:33,486: Training iter 3168, loss=0.42757493257522583 (33.318413s)
INFO:niftynet:2018-02-04 05:34:02,772: Training iter 3169, loss=0.32123011350631714 (89.285458s)
INFO:niftynet:2018-02-04 05:34:06,398: Training iter 3170, loss=0.45248788595199585 (3.625087s)
INFO:niftynet:2018-02-04 05:34:21,645: Training iter 3171, loss=0.24756717681884766 (15.246397s)
INFO:niftynet:2018-02-04 05:36:33,541: Training iter 3172, loss=0.35753607749938965 (131.896121s)
INFO:niftynet:2018-02-04 05:36:37,097: Training iter 3173, loss=0.3531233072280884 (3.555879s)
INFO:niftynet:2018-02-04 05:36:39,913: Training iter 3174, loss=0.31457221508026123 (2.815682s)
INFO:niftynet:2018-02-04 05:36:40,680: Training iter 3175, loss=0.3850466012954712 (0.766757s)
INFO:niftynet:2018-02-04 05:38:38,645: Training iter 3176, loss=0.3737855553627014 (117.964742s)
INFO:niftynet:2018-02-04 05:39:03,697: Training iter 3177, loss=0.2529038190841675 (25.051330s)
INFO:niftynet:2018-02-04 05:39:06,079: Training iter 3178, loss=0.31511157751083374 (2.381510s)
INFO:niftynet:2018-02-04 05:39:10,134: Training iter 3179, loss=0.39967888593673706 (4.054745s)
INFO:niftynet:2018-02-04 05:41:17,204: Training iter 3180, loss=0.2961346507072449 (127.069678s)
INFO:niftynet:2018-02-04 05:41:28,663: Training iter 3181, loss=0.4498443603515625 (11.457977s)
INFO:niftynet:2018-02-04 05:43:07,923: Training iter 3182, loss=0.3496859669685364 (99.259753s)
INFO:niftynet:2018-02-04 05:43:47,061: Training iter 3183, loss=0.33152270317077637 (39.137043s)
INFO:niftynet:2018-02-04 05:45:29,358: Training iter 3184, loss=0.34254682064056396 (102.297114s)
INFO:niftynet:2018-02-04 05:46:08,972: Training iter 3185, loss=0.37956470251083374 (39.613671s)
INFO:niftynet:2018-02-04 05:47:46,622: Training iter 3186, loss=0.3334100842475891 (97.649066s)
INFO:niftynet:2018-02-04 05:48:20,715: Training iter 3187, loss=0.35623109340667725 (34.092308s)
INFO:niftynet:2018-02-04 05:50:08,737: Training iter 3188, loss=0.3457310199737549 (108.022139s)
INFO:niftynet:2018-02-04 05:50:56,804: Training iter 3189, loss=0.30419886112213135 (48.066392s)
INFO:niftynet:2018-02-04 05:51:00,891: Training iter 3190, loss=0.3093259334564209 (4.086583s)
INFO:niftynet:2018-02-04 05:51:05,667: Training iter 3191, loss=0.36458611488342285 (4.775456s)
INFO:niftynet:2018-02-04 05:51:09,776: Training iter 3192, loss=0.3194471597671509 (4.108997s)
INFO:niftynet:2018-02-04 05:52:49,692: Training iter 3193, loss=0.3613234758377075 (99.913957s)
INFO:niftynet:2018-02-04 05:53:31,460: Training iter 3194, loss=0.39149928092956543 (41.768167s)
INFO:niftynet:2018-02-04 05:55:14,305: Training iter 3195, loss=0.324373722076416 (102.844351s)
INFO:niftynet:2018-02-04 05:55:37,406: Training iter 3196, loss=0.39397990703582764 (23.100681s)
INFO:niftynet:2018-02-04 05:57:40,000: Training iter 3197, loss=0.361843466758728 (122.594049s)
INFO:niftynet:2018-02-04 05:57:57,155: Training iter 3198, loss=0.3906646966934204 (17.154123s)
INFO:niftynet:2018-02-04 05:59:41,774: Training iter 3199, loss=0.3295634388923645 (104.619388s)
INFO:niftynet:2018-02-04 05:59:46,338: Training iter 3200, loss=0.22172576189041138 (4.563224s)
INFO:niftynet:2018-02-04 06:00:36,766: Training iter 3201, loss=0.40439850091934204 (50.427679s)
INFO:niftynet:2018-02-04 06:00:42,709: Training iter 3202, loss=0.36596059799194336 (5.942629s)
INFO:niftynet:2018-02-04 06:02:02,607: Training iter 3203, loss=0.3764580488204956 (79.898031s)
INFO:niftynet:2018-02-04 06:02:51,451: Training iter 3204, loss=0.45148003101348877 (48.843117s)
INFO:niftynet:2018-02-04 06:04:29,131: Training iter 3205, loss=0.2574602961540222 (97.679569s)
INFO:niftynet:2018-02-04 06:05:23,023: Training iter 3206, loss=0.3119458556175232 (53.891906s)
INFO:niftynet:2018-02-04 06:06:40,857: Training iter 3207, loss=0.3018069863319397 (77.833536s)
INFO:niftynet:2018-02-04 06:06:53,172: Training iter 3208, loss=0.2596304416656494 (12.314524s)
INFO:niftynet:2018-02-04 06:06:58,747: Training iter 3209, loss=0.37166810035705566 (5.574576s)
INFO:niftynet:2018-02-04 06:07:01,132: Training iter 3210, loss=0.3726741075515747 (2.384812s)
INFO:niftynet:2018-02-04 06:07:02,482: Training iter 3211, loss=0.4551212191581726 (1.349622s)
INFO:niftynet:2018-02-04 06:07:04,463: Training iter 3212, loss=0.3341611623764038 (1.981073s)
INFO:niftynet:2018-02-04 06:07:09,727: Training iter 3213, loss=0.45620471239089966 (5.262965s)
INFO:niftynet:2018-02-04 06:07:31,139: Training iter 3214, loss=0.329740047454834 (21.411524s)
INFO:niftynet:2018-02-04 06:07:34,860: Training iter 3215, loss=0.3178499937057495 (3.720585s)
INFO:niftynet:2018-02-04 06:07:40,004: Training iter 3216, loss=0.36263757944107056 (5.143972s)
INFO:niftynet:2018-02-04 06:07:42,995: Training iter 3217, loss=0.3526430130004883 (2.990682s)
INFO:niftynet:2018-02-04 06:07:46,750: Training iter 3218, loss=0.3731982707977295 (3.754132s)
INFO:niftynet:2018-02-04 06:07:50,594: Training iter 3219, loss=0.2938324213027954 (3.843937s)
INFO:niftynet:2018-02-04 06:07:54,617: Training iter 3220, loss=0.23273944854736328 (4.022754s)
INFO:niftynet:2018-02-04 06:07:58,565: Training iter 3221, loss=0.3634291887283325 (3.947105s)
INFO:niftynet:2018-02-04 06:09:02,375: Training iter 3222, loss=0.3502005934715271 (63.809136s)
INFO:niftynet:2018-02-04 06:10:17,726: Training iter 3223, loss=0.28568869829177856 (75.350587s)
INFO:niftynet:2018-02-04 06:11:33,106: Training iter 3224, loss=0.33960264921188354 (75.380043s)
INFO:niftynet:2018-02-04 06:11:40,878: Training iter 3225, loss=0.30811697244644165 (7.770978s)
INFO:niftynet:2018-02-04 06:12:21,702: Training iter 3226, loss=0.3617017865180969 (40.823450s)
INFO:niftynet:2018-02-04 06:13:06,006: Training iter 3227, loss=0.3623029589653015 (44.303903s)
INFO:niftynet:2018-02-04 06:13:10,073: Training iter 3228, loss=0.3605882525444031 (4.066298s)
INFO:niftynet:2018-02-04 06:13:14,929: Training iter 3229, loss=0.4044530391693115 (4.855924s)
INFO:niftynet:2018-02-04 06:13:18,860: Training iter 3230, loss=0.38412976264953613 (3.930265s)
INFO:niftynet:2018-02-04 06:13:22,202: Training iter 3231, loss=0.3329523801803589 (3.341710s)
INFO:niftynet:2018-02-04 06:13:25,515: Training iter 3232, loss=0.30099356174468994 (3.312572s)
INFO:niftynet:2018-02-04 06:13:29,756: Training iter 3233, loss=0.38142162561416626 (4.240382s)
INFO:niftynet:2018-02-04 06:13:33,951: Training iter 3234, loss=0.19206523895263672 (4.194987s)
INFO:niftynet:2018-02-04 06:13:38,151: Training iter 3235, loss=0.40177416801452637 (4.199735s)
INFO:niftynet:2018-02-04 06:14:23,650: Training iter 3236, loss=0.30711883306503296 (45.498394s)
INFO:niftynet:2018-02-04 06:14:28,202: Training iter 3237, loss=0.3050880432128906 (4.552024s)
INFO:niftynet:2018-02-04 06:14:43,215: Training iter 3238, loss=0.32294952869415283 (15.011946s)
INFO:niftynet:2018-02-04 06:14:47,469: Training iter 3239, loss=0.37166672945022583 (4.254439s)
INFO:niftynet:2018-02-04 06:15:18,144: Training iter 3240, loss=0.35803234577178955 (30.673959s)
INFO:niftynet:2018-02-04 06:15:21,169: Training iter 3241, loss=0.38414567708969116 (3.024954s)
INFO:niftynet:2018-02-04 06:15:26,273: Training iter 3242, loss=0.20106416940689087 (5.103477s)
INFO:niftynet:2018-02-04 06:15:44,681: Training iter 3243, loss=0.45381438732147217 (18.407668s)
INFO:niftynet:2018-02-04 06:15:55,206: Training iter 3244, loss=0.45816147327423096 (10.524818s)
INFO:niftynet:2018-02-04 06:17:34,558: Training iter 3245, loss=0.3450808525085449 (99.351294s)
INFO:niftynet:2018-02-04 06:17:35,833: Training iter 3246, loss=0.36030423641204834 (1.274569s)
INFO:niftynet:2018-02-04 06:18:09,591: Training iter 3247, loss=0.3944627046585083 (33.757943s)
INFO:niftynet:2018-02-04 06:18:16,504: Training iter 3248, loss=0.34504562616348267 (6.911891s)
INFO:niftynet:2018-02-04 06:19:55,830: Training iter 3249, loss=0.310807466506958 (99.326094s)
INFO:niftynet:2018-02-04 06:20:04,087: Training iter 3250, loss=0.29225218296051025 (8.256889s)
INFO:niftynet:2018-02-04 06:20:56,482: Training iter 3251, loss=0.30756092071533203 (52.394215s)
INFO:niftynet:2018-02-04 06:20:59,762: Training iter 3252, loss=0.38828516006469727 (3.279202s)
INFO:niftynet:2018-02-04 06:22:12,593: Training iter 3253, loss=0.2999073266983032 (72.831080s)
INFO:niftynet:2018-02-04 06:22:16,161: Training iter 3254, loss=0.3179304599761963 (3.568010s)
INFO:niftynet:2018-02-04 06:22:54,960: Training iter 3255, loss=0.33782631158828735 (38.797640s)
INFO:niftynet:2018-02-04 06:23:00,691: Training iter 3256, loss=0.23302197456359863 (5.731607s)
INFO:niftynet:2018-02-04 06:23:25,205: Training iter 3257, loss=0.37238794565200806 (24.513435s)
INFO:niftynet:2018-02-04 06:23:30,058: Training iter 3258, loss=0.33794671297073364 (4.852362s)
INFO:niftynet:2018-02-04 06:25:23,384: Training iter 3259, loss=0.3415184020996094 (113.325167s)
INFO:niftynet:2018-02-04 06:25:24,973: Training iter 3260, loss=0.32149529457092285 (1.588773s)
INFO:niftynet:2018-02-04 06:25:26,349: Training iter 3261, loss=0.3000342845916748 (1.376515s)
INFO:niftynet:2018-02-04 06:25:27,177: Training iter 3262, loss=0.3487347364425659 (0.826794s)
INFO:niftynet:2018-02-04 06:27:21,757: Training iter 3263, loss=0.4114413857460022 (114.580435s)
INFO:niftynet:2018-02-04 06:27:32,390: Training iter 3264, loss=0.4223620891571045 (10.631936s)
INFO:niftynet:2018-02-04 06:27:54,420: Training iter 3265, loss=0.31552964448928833 (22.030401s)
INFO:niftynet:2018-02-04 06:27:59,099: Training iter 3266, loss=0.38719117641448975 (4.678531s)
INFO:niftynet:2018-02-04 06:28:01,817: Training iter 3267, loss=0.3859798312187195 (2.717255s)
INFO:niftynet:2018-02-04 06:28:06,038: Training iter 3268, loss=0.2845779061317444 (4.220529s)
INFO:niftynet:2018-02-04 06:30:05,697: Training iter 3269, loss=0.34881770610809326 (119.658882s)
INFO:niftynet:2018-02-04 06:30:31,068: Training iter 3270, loss=0.2866211533546448 (25.369788s)
INFO:niftynet:2018-02-04 06:32:27,183: Training iter 3271, loss=0.45742106437683105 (116.115514s)
INFO:niftynet:2018-02-04 06:32:52,256: Training iter 3272, loss=0.32958269119262695 (25.071903s)
INFO:niftynet:2018-02-04 06:34:53,627: Training iter 3273, loss=0.3918994665145874 (121.370547s)
INFO:niftynet:2018-02-04 06:35:07,340: Training iter 3274, loss=0.3327748775482178 (13.713155s)
INFO:niftynet:2018-02-04 06:35:09,854: Training iter 3275, loss=0.3571701645851135 (2.513450s)
INFO:niftynet:2018-02-04 06:35:14,191: Training iter 3276, loss=0.3478344678878784 (4.336564s)
INFO:niftynet:2018-02-04 06:35:17,657: Training iter 3277, loss=0.33886802196502686 (3.465799s)
INFO:niftynet:2018-02-04 06:37:12,743: Training iter 3278, loss=0.33371686935424805 (115.085766s)
INFO:niftynet:2018-02-04 06:37:52,863: Training iter 3279, loss=0.3267320394515991 (40.119349s)
INFO:niftynet:2018-02-04 06:38:03,444: Training iter 3280, loss=0.39515119791030884 (10.580738s)
INFO:niftynet:2018-02-04 06:38:08,533: Training iter 3281, loss=0.39593005180358887 (5.087229s)
INFO:niftynet:2018-02-04 06:38:55,054: Training iter 3282, loss=0.2926979064941406 (46.520983s)
INFO:niftynet:2018-02-04 06:39:46,942: Training iter 3283, loss=0.26046520471572876 (51.887233s)
INFO:niftynet:2018-02-04 06:40:48,004: Training iter 3284, loss=0.33731329441070557 (61.061883s)
INFO:niftynet:2018-02-04 06:40:52,419: Training iter 3285, loss=0.3965139389038086 (4.414739s)
INFO:niftynet:2018-02-04 06:40:58,106: Training iter 3286, loss=0.3300483226776123 (5.686168s)
INFO:niftynet:2018-02-04 06:41:29,659: Training iter 3287, loss=0.3648573160171509 (31.552723s)
INFO:niftynet:2018-02-04 06:41:33,914: Training iter 3288, loss=0.3814588189125061 (4.255000s)
INFO:niftynet:2018-02-04 06:43:27,288: Training iter 3289, loss=0.38389337062835693 (113.373166s)
INFO:niftynet:2018-02-04 06:43:33,369: Training iter 3290, loss=0.35278820991516113 (6.081323s)
INFO:niftynet:2018-02-04 06:44:37,468: Training iter 3291, loss=0.26125091314315796 (64.098298s)
INFO:niftynet:2018-02-04 06:45:43,937: Training iter 3292, loss=0.3171950578689575 (66.468516s)
INFO:niftynet:2018-02-04 06:47:08,380: Training iter 3293, loss=0.36618542671203613 (84.442773s)
INFO:niftynet:2018-02-04 06:47:15,213: Training iter 3294, loss=0.23651033639907837 (6.832281s)
INFO:niftynet:2018-02-04 06:47:17,952: Training iter 3295, loss=0.3154541850090027 (2.738967s)
INFO:niftynet:2018-02-04 06:47:19,132: Training iter 3296, loss=0.3524249196052551 (1.179875s)
INFO:niftynet:2018-02-04 06:47:40,426: Training iter 3297, loss=0.39479655027389526 (21.294134s)
INFO:niftynet:2018-02-04 06:49:29,152: Training iter 3298, loss=0.228204607963562 (108.725102s)
INFO:niftynet:2018-02-04 06:49:30,042: Training iter 3299, loss=0.37164533138275146 (0.890049s)
INFO:niftynet:2018-02-04 06:49:36,257: Training iter 3300, loss=0.38100993633270264 (6.213617s)
INFO:niftynet:2018-02-04 06:49:41,018: Training iter 3301, loss=0.34930336475372314 (4.760895s)
INFO:niftynet:2018-02-04 06:49:49,518: Training iter 3302, loss=0.2950022220611572 (8.499232s)
INFO:niftynet:2018-02-04 06:49:53,824: Training iter 3303, loss=0.34184563159942627 (4.305547s)
INFO:niftynet:2018-02-04 06:51:26,664: Training iter 3304, loss=0.38591206073760986 (92.840185s)
INFO:niftynet:2018-02-04 06:52:00,924: Training iter 3305, loss=0.3709667921066284 (34.259344s)
INFO:niftynet:2018-02-04 06:53:30,118: Training iter 3306, loss=0.39874720573425293 (89.193628s)
INFO:niftynet:2018-02-04 06:54:38,635: Training iter 3307, loss=0.3880261182785034 (68.516998s)
INFO:niftynet:2018-02-04 06:56:32,618: Training iter 3308, loss=0.3447127938270569 (113.982489s)
INFO:niftynet:2018-02-04 06:57:12,661: Training iter 3309, loss=0.344002366065979 (40.042485s)
INFO:niftynet:2018-02-04 06:57:18,416: Training iter 3310, loss=0.35473883152008057 (5.754361s)
INFO:niftynet:2018-02-04 06:58:04,165: Training iter 3311, loss=0.30252134799957275 (45.748982s)
INFO:niftynet:2018-02-04 06:58:20,667: Training iter 3312, loss=0.38298726081848145 (16.501662s)
INFO:niftynet:2018-02-04 06:59:53,244: Training iter 3313, loss=0.37479448318481445 (92.576876s)
INFO:niftynet:2018-02-04 06:59:59,512: Training iter 3314, loss=0.4118778705596924 (6.266973s)
INFO:niftynet:2018-02-04 07:01:16,088: Training iter 3315, loss=0.37281399965286255 (76.576032s)
INFO:niftynet:2018-02-04 07:02:16,970: Training iter 3316, loss=0.3765242099761963 (60.881196s)
INFO:niftynet:2018-02-04 07:02:45,361: Training iter 3317, loss=0.3546115756034851 (28.390307s)
INFO:niftynet:2018-02-04 07:02:50,755: Training iter 3318, loss=0.36963313817977905 (5.394172s)
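
To quantify the slowdown, the per-iteration times can be pulled out of a log like the one above. A minimal sketch (the regex matches the log format shown here; the function names and the 30-second threshold are my own choices, not NiftyNet utilities):

```python
import re

# Matches e.g. "Training iter 3112, loss=0.371... (14.242115s)"
LINE = re.compile(r"Training iter (\d+), loss=\S+ \((\d+\.\d+)s\)")

def iteration_times(log_text):
    """Return a list of (iteration, seconds) parsed from a NiftyNet console log."""
    return [(int(i), float(s)) for i, s in LINE.findall(log_text)]

def slow_fraction(times, threshold=30.0):
    """Fraction of iterations slower than `threshold` seconds."""
    if not times:
        return 0.0
    return sum(1 for _, s in times if s > threshold) / len(times)
```

Running this over consecutive chunks of the log would show whether the fraction of slow iterations actually grows over time, or merely fluctuates.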


yanwu1 avatar Feb 06 '18 17:02 yanwu1

Thanks for the update @yanwu1. Internally, the .mhd files are converted into NIfTI format using the SimpleITKAsNibabel interface (https://github.com/NifTK/NiftyNet/blob/v0.2.2/niftynet/io/simple_itk_as_nibabel.py#L14). The way these objects are cached can require a large amount of memory. This will be further optimised in the next release; for now, converting the images into NIfTI format beforehand might mitigate the issue.
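
The suggested conversion can be sketched with SimpleITK directly. This is an illustrative script, not part of NiftyNet; the naming scheme and `out_dir` layout are my own assumptions:

```python
import os

def nifti_name(mhd_path, out_dir):
    """Map e.g. Case00.mhd to <out_dir>/Case00.nii.gz (hypothetical naming)."""
    base = os.path.splitext(os.path.basename(mhd_path))[0]
    return os.path.join(out_dir, base + ".nii.gz")

def convert_mhd_to_nifti(mhd_path, out_dir):
    """Read a MetaImage (.mhd/.raw) volume and rewrite it as compressed NIfTI."""
    import SimpleITK as sitk  # pip install SimpleITK
    sitk.WriteImage(sitk.ReadImage(mhd_path), nifti_name(mhd_path, out_dir))
```

After converting, point `path_to_search` at the output directory and change `filename_contains` from `mhd` to `nii` accordingly.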

wyli avatar Feb 06 '18 18:02 wyli

This should be fixed by the new image_loader: https://github.com/NifTK/NiftyNet/commit/c9a55b4929e2727d8e83f45eb7fbd98acc4ce468. Please feel free to reopen this issue if the problem persists.

wyli avatar Mar 07 '18 16:03 wyli

First: Thank you for creating NiftyNet! It is exactly what is needed and it is a pleasure to use!

It looks like I am experiencing a similar issue to the one described above. It occurs with various .ini settings (1 GPU or multiple GPUs, large or small queue, GitHub NiftyNet or pip install, input image sizes of 144x144x144 or 256x256x80, always DenseVNet).

After a couple of hundred iterations, the iteration time goes up by a factor of 10. The machine has a 6-core CPU, 64 GB of RAM, and 4x Titan Xp. A sample .ini and its output are below. It would be great if you could point me in any direction for solving this. For now, my only idea is to kill and restart the process automatically every couple of hours, which is not great.

Thank you very much for your help!
Veit
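
The periodic restart mentioned above can at least be automated. A rough sketch, assuming the config keeps starting_iter = -1 so each relaunch resumes from the latest checkpoint (the 4-hour interval is arbitrary):

```shell
#!/bin/sh
# Hypothetical watchdog: kill and relaunch training every 4 hours.
# With starting_iter = -1 and save_every_n = 100 in the .ini, each run
# resumes from the most recent checkpoint, so little progress is lost.
while true; do
    timeout 4h python net_segment.py -c kidneyXL.ini train
done
```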

############################ input configuration sections
[ctNC]
path_to_search = /home/user/Kidneyproject/trainCTCMix/preprocessed
filename_contains = WITHOUT
spatial_window_size = (256, 256, 80)
interp_order = 1
axcodes=(A, R, S)

[FATMAP]
path_to_search = /home/user/Kidneyproject/trainCTCMix/preprocessed
filename_contains = FATMAP
spatial_window_size = (256, 256, 80)
interp_order = 1
axcodes=(A, R, S)

[label]
path_to_search = /home/user/Kidneyproject/trainCTCMix/preprocessed
filename_contains = KIDNEYLABELS
spatial_window_size = (256, 256, 80)
interp_order = 0
axcodes=(A, R, S)

############################## system configuration sections
[SYSTEM]
cuda_devices = ""
num_threads = 4
num_gpus = 4
model_dir = models/kidneyFATMAP
queue_length = 2

[NETWORK]
name = dense_vnet
batch_size = 2

histogram_ref_file = ./model/standardisation_models.txt
norm_type = percentile
cutoff = (0.01, 0.99)
normalisation = True
whitening = True
activation_function=prelu
window_sampling = resize
keep_prob = 0.5

[TRAINING]
sample_per_volume = 1
lr = 0.001
loss_type = dense_vnet_abdominal_ct.dice_hinge.dice
starting_iter = -1
save_every_n = 100
max_iter = 100000
#tensorboard_every_n = 10

[INFERENCE]
border = (16, 16, 16)
inference_iter = -1
output_interp_order = 0
spatial_window_size = (144, 144, 144)
save_seg_dir = ./segmentation_output/

############################ custom configuration sections
[SEGMENTATION]
image = ctNC, FATMAP
label = label
sampler = label
label_normalisation = False
output_prob = False
num_classes = 2


Log:

~/NiftyNet$ python net_segment.py -c kidneyXL.ini train
CRITICAL:tensorflow:Optional Python module cv2 not found, please install cv2 and retry if the application fails.
NiftyNet version 0.3.0+296.ge8e0297.dirty
[CTNC] -- axcodes: ('A', 'R', 'S') -- interp_order: 1 -- csv_file: -- path_to_search: /home/user/Kidneyproject/trainCTCMix/preprocessed -- spatial_window_size: (256, 256, 80) -- filename_contains: ('WITHOUT',) -- pixdim: () -- loader: None -- filename_not_contains: ()
[LABEL] -- axcodes: ('A', 'R', 'S') -- interp_order: 0 -- csv_file: -- path_to_search: /home/user/Kidneyproject/trainCTCMix/preprocessed -- spatial_window_size: (256, 256, 80) -- filename_contains: ('KIDNEYLABELS',) -- pixdim: () -- loader: None -- filename_not_contains: ()
[CONFIG_FILE] -- path: /home/user/NiftyNet/kidneyXL.ini
[CUSTOM] -- proba_connect: True -- label: ('label',) -- label_normalisation: False -- softmax: True -- num_classes: 2 -- evaluation_units: foreground -- rand_samples: 0 -- inferred: () -- min_sampling_ratio: 0 -- name: net_segment -- sampler: ('label',) -- output_prob: False -- image: ('ctNC', 'FATMAP') -- compulsory_labels: (0, 1) -- weight: () -- min_numb_labels: 1
[FATMAP] -- axcodes: ('A', 'R', 'S') -- interp_order: 1 -- csv_file: -- path_to_search: /home/user/Kidneyproject/trainCTCMix/preprocessed -- spatial_window_size: (256, 256, 80) -- filename_contains: ('FATMAP',) -- pixdim: () -- loader: None -- filename_not_contains: ()
[NETWORK] -- decay: 0.0 -- queue_length: 5 -- batch_size: 2 -- normalisation: True -- bias_initializer_args: {} -- foreground_type: otsu_plus -- bias_initializer: zeros -- reg_type: L2 -- histogram_ref_file: ./model/standardisation_models.txt -- whitening: True -- norm_type: percentile -- name: dense_vnet -- weight_initializer: he_normal -- multimod_foreground_type: and -- cutoff: (0.01, 0.99) -- keep_prob: 0.5 -- weight_initializer_args: {} -- activation_function: prelu -- window_sampling: resize -- volume_padding_size: (0, 0, 0) -- volume_padding_mode: minimum -- normalise_foreground_only: False
[SYSTEM] -- action: training -- num_threads: 4 -- queue_length: 2 -- num_gpus: 4 -- model_dir: /home/user/NiftyNet/models/kidneyFATMAP -- event_handler: ('model_saver', 'model_restorer', 'sampler_threading', 'apply_gradients', 'output_interpreter', 'console_logger', 'tensorboard_logger') -- cuda_devices: "" -- dataset_split_file: ./dataset_split.csv -- iteration_generator: iteration_generator
[INFERENCE] -- output_postfix: _niftynet_out -- output_interp_order: 0 -- dataset_to_infer: -- border: (16, 16, 16) -- spatial_window_size: (144, 144, 144) -- inference_iter: -1 -- save_seg_dir: ./segmentation_output/
[TRAINING] -- random_flipping_axes: -1 -- deformation_sigma: 15 -- rotation_angle_x: () -- validation_every_n: -1 -- optimiser: adam -- do_elastic_deformation: False -- validation_max_iter: 1 -- starting_iter: -1 -- save_every_n: 100 -- tensorboard_every_n: 10 -- max_iter: 100000 -- loss_type: dense_vnet_abdominal_ct.dice_hinge.dice -- rotation_angle: () -- proportion_to_deform: 0.5 -- scaling_percentage: () -- exclude_fraction_for_inference: 0.0 -- rotation_angle_y: () -- lr: 0.001 -- max_checkpoints: 100 -- num_ctrl_points: 4 -- sample_per_volume: 1 -- rotation_angle_z: () -- exclude_fraction_for_validation: 0.0
INFO:niftynet: set initial_iter to 616 based on checkpoints
INFO:niftynet: starting segmentation application
INFO:niftynet: csv_file = not found, writing to "/home/user/NiftyNet/models/kidneyFATMAP/ctNC.csv" instead.
INFO:niftynet: Overwriting existing: "/home/user/NiftyNet/models/kidneyFATMAP/ctNC.csv".
INFO:niftynet: [ctNC] search file folders, writing csv file /home/user/NiftyNet/models/kidneyFATMAP/ctNC.csv
INFO:niftynet: csv_file = not found, writing to "/home/user/NiftyNet/models/kidneyFATMAP/label.csv" instead.
INFO:niftynet: Overwriting existing: "/home/user/NiftyNet/models/kidneyFATMAP/label.csv".
INFO:niftynet: [label] search file folders, writing csv file /home/user/NiftyNet/models/kidneyFATMAP/label.csv
INFO:niftynet: csv_file = not found, writing to "/home/user/NiftyNet/models/kidneyFATMAP/FATMAP.csv" instead.
INFO:niftynet: Overwriting existing: "/home/user/NiftyNet/models/kidneyFATMAP/FATMAP.csv".
INFO:niftynet: [FATMAP] search file folders, writing csv file /home/user/NiftyNet/models/kidneyFATMAP/FATMAP.csv
INFO:niftynet: Number of subjects 186, input section names: ['subject_id', 'ctNC', 'label', 'FATMAP'] -- using all subjects (without data partitioning).
INFO:niftynet: Image reader: loading 186 subjects from sections ('ctNC', 'FATMAP') as input [image]
INFO:niftynet: Image reader: loading 186 subjects from sections ('label',) as input [label]
INFO:niftynet: Image reader: loading 186 subjects from sections ('label',) as input [sampler]
INFO:niftynet: normalisation histogram reference models ready for image:('ctNC', 'FATMAP')
2018-08-21 15:56:48.505522: I tensorflow/core/platform/cpu_feature_guard.cc:137] Your CPU supports instructions that this TensorFlow binary was not compiled to use: SSE4.1 SSE4.2 AVX AVX2 FMA
2018-08-21 15:56:51.412365: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1105] Found device 0 with properties: name: TITAN X (Pascal) major: 6 minor: 1 memoryClockRate(GHz): 1.531 pciBusID: 0000:05:00.0 totalMemory: 11.90GiB freeMemory: 11.74GiB
2018-08-21 15:56:51.640495: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1105] Found device 1 with properties: name: TITAN X (Pascal) major: 6 minor: 1 memoryClockRate(GHz): 1.531 pciBusID: 0000:06:00.0 totalMemory: 11.90GiB freeMemory: 11.75GiB
2018-08-21 15:56:51.865962: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1105] Found device 2 with properties: name: TITAN X (Pascal) major: 6 minor: 1 memoryClockRate(GHz): 1.531 pciBusID: 0000:09:00.0 totalMemory: 11.90GiB freeMemory: 11.75GiB
2018-08-21 15:56:52.089807: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1105] Found device 3 with properties: name: TITAN X (Pascal) major: 6 minor: 1 memoryClockRate(GHz): 1.531 pciBusID: 0000:0a:00.0 totalMemory: 11.90GiB freeMemory: 11.75GiB
2018-08-21 15:56:52.093953: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1120] Device peer to peer matrix
2018-08-21 15:56:52.094067: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1126] DMA: 0 1 2 3
2018-08-21 15:56:52.094076: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1136] 0: Y Y Y Y
2018-08-21 15:56:52.094083: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1136] 1: Y Y Y Y
2018-08-21 15:56:52.094092: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1136] 2: Y Y Y Y
2018-08-21 15:56:52.094099: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1136] 3: Y Y Y Y
2018-08-21 15:56:52.094109: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1195] Creating TensorFlow device (/device:GPU:0) -> (device: 0, name: TITAN X (Pascal), pci bus id: 0000:05:00.0, compute capability: 6.1)
2018-08-21 15:56:52.094117: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1195] Creating TensorFlow device (/device:GPU:1) -> (device: 1, name: TITAN X (Pascal), pci bus id: 0000:06:00.0, compute capability: 6.1)
2018-08-21 15:56:52.094125: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1195] Creating TensorFlow device (/device:GPU:2) -> (device: 2, name: TITAN X (Pascal), pci bus id: 0000:09:00.0, compute capability: 6.1)
2018-08-21 15:56:52.094132: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1195] Creating TensorFlow device (/device:GPU:3) -> (device: 3, name: TITAN X (Pascal), pci bus id: 0000:0a:00.0, compute capability: 6.1)
INFO:niftynet: reading size of preprocessed images
INFO:niftynet: initialised window instance
INFO:niftynet: buffering with 5 windows
INFO:niftynet: initialised sampler output {'label': (1, 256, 256, 80, 1, 1), 'sampler': (1, 256, 256, 80, 1, 1), 'sampler_location': (1, 7), 'image': (1, 256, 256, 80, 1, 2), 'image_location': (1, 7), 'label_location': (1, 7)}
INFO:niftynet: using DenseVNet
2018-08-21 15:56:53.003951: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1195] Creating TensorFlow device (/device:GPU:0) -> (device: 0, name: TITAN X (Pascal), pci bus id: 0000:05:00.0, compute capability: 6.1)
2018-08-21 15:56:53.003969: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1195] Creating TensorFlow device (/device:GPU:1) -> (device: 1, name: TITAN X (Pascal), pci bus id: 0000:06:00.0, compute capability: 6.1)
2018-08-21 15:56:53.003978: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1195] Creating TensorFlow device (/device:GPU:2) -> (device: 2, name: TITAN X (Pascal), pci bus id: 0000:09:00.0, compute capability: 6.1)
2018-08-21 15:56:53.003984: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1195] Creating TensorFlow device (/device:GPU:3) -> (device: 3, name: TITAN X (Pascal), pci bus id: 0000:0a:00.0, compute capability: 6.1)
INFO:niftynet: Import [dice] from /home/user/niftynet/extensions/dense_vnet_abdominal_ct/dice_hinge.py.
2018-08-21 15:56:57.702001: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1195] Creating TensorFlow device (/device:GPU:0) -> (device: 0, name: TITAN X (Pascal), pci bus id: 0000:05:00.0, compute capability: 6.1)
2018-08-21 15:56:57.702022: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1195] Creating TensorFlow device (/device:GPU:1) -> (device: 1, name: TITAN X (Pascal), pci bus id: 0000:06:00.0, compute capability: 6.1)
2018-08-21 15:56:57.702031: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1195] Creating TensorFlow device (/device:GPU:2) -> (device: 2, name: TITAN X (Pascal), pci bus id: 0000:09:00.0, compute capability: 6.1)
2018-08-21 15:56:57.702039: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1195] Creating TensorFlow device (/device:GPU:3) -> (device: 3, name: TITAN X (Pascal), pci bus id: 0000:0a:00.0, compute capability: 6.1)
INFO:niftynet: Import [dice] from /home/user/niftynet/extensions/dense_vnet_abdominal_ct/dice_hinge.py.
2018-08-21 15:57:01.709489: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1195] Creating TensorFlow device (/device:GPU:0) -> (device: 0, name: TITAN X (Pascal), pci bus id: 0000:05:00.0, compute capability: 6.1)
2018-08-21 15:57:01.709508: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1195] Creating TensorFlow device (/device:GPU:1) -> (device: 1, name: TITAN X (Pascal), pci bus id: 0000:06:00.0, compute capability: 6.1)
2018-08-21 15:57:01.709517: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1195] Creating TensorFlow device (/device:GPU:2) -> (device: 2, name: TITAN X (Pascal), pci bus id: 0000:09:00.0, compute capability: 6.1)
2018-08-21 15:57:01.709524: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1195] Creating TensorFlow device (/device:GPU:3) -> (device: 3, name: TITAN X (Pascal), pci bus id: 0000:0a:00.0, compute capability: 6.1)
INFO:niftynet: Import [dice] from /home/user/niftynet/extensions/dense_vnet_abdominal_ct/dice_hinge.py.
2018-08-21 15:57:05.636534: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1195] Creating TensorFlow device (/device:GPU:0) -> (device: 0, name: TITAN X (Pascal), pci bus id: 0000:05:00.0, compute capability: 6.1)
2018-08-21 15:57:05.636555: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1195] Creating TensorFlow device (/device:GPU:1) -> (device: 1, name: TITAN X (Pascal), pci bus id: 0000:06:00.0, compute capability: 6.1)
2018-08-21 15:57:05.636566: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1195] Creating TensorFlow device (/device:GPU:2) -> (device: 2, name: TITAN X (Pascal), pci bus id: 0000:09:00.0, compute capability: 6.1)
2018-08-21 15:57:05.636573: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1195] Creating TensorFlow device (/device:GPU:3) -> (device: 3, name: TITAN X (Pascal), pci bus id: 0000:0a:00.0, compute capability: 6.1)
INFO:niftynet: Import [dice] from /home/user/niftynet/extensions/dense_vnet_abdominal_ct/dice_hinge.py.
2018-08-21 15:57:18.721653: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1195] Creating TensorFlow device (/device:GPU:0) -> (device: 0, name: TITAN X (Pascal), pci bus id: 0000:05:00.0, compute capability: 6.1) 2018-08-21 15:57:18.721676: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1195] Creating TensorFlow device (/device:GPU:1) -> (device: 1, name: TITAN X (Pascal), pci bus id: 0000:06:00.0, compute capability: 6.1) 2018-08-21 15:57:18.721686: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1195] Creating TensorFlow device (/device:GPU:2) -> (device: 2, name: TITAN X (Pascal), pci bus id: 0000:09:00.0, compute capability: 6.1) 2018-08-21 15:57:18.721693: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1195] Creating TensorFlow device (/device:GPU:3) -> (device: 3, name: TITAN X (Pascal), pci bus id: 0000:0a:00.0, compute capability: 6.1) INFO:niftynet: Starting preprocessing threads... INFO:niftynet: New thread: 0 INFO:niftynet: New thread: 1 INFO:niftynet: New thread: 2 INFO:niftynet: New thread: 3 INFO:niftynet: filling queues (this can take a few minutes). 
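As an aside: the log above shows TensorFlow claiming all four TITAN X cards even though only one GPU is used for training (num_gpus = 1). If that is not intended, the cuda_devices option in the [SYSTEM] section (left as "" in the config quoted earlier in this thread) can restrict visibility to a single card; a sketch, assuming device 0 is the one you want:

```ini
[SYSTEM]
# Assumption: "0" pins the process to GPU 0 (cuda_devices maps to
# CUDA_VISIBLE_DEVICES); the remaining values mirror the demo config.
cuda_devices = "0"
num_threads = 2
num_gpus = 1
model_dir = models/promise12
```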
INFO:niftynet: starting from iter 616
INFO:niftynet: Accessing /home/user/NiftyNet/models/kidneyFATMAP/models/model.ckpt-616
INFO:niftynet: Restoring parameters from /home/user/NiftyNet/models/kidneyFATMAP/models/model.ckpt-616
dice[0.996741652 0.80557555]
dice[0.998921335 0.774603367]
dice[0.99671334 0.797460854]
dice[0.996096969 0.101980627]
dice[0.997946739 0.452415854]
dice[0.997963905 0.552981]
dice[0.997029066 0.560121775]
dice[0.998157918 0.750509202]
INFO:niftynet: training iter 617, loss_1=0.10603952407836914, loss_3=0.24967312812805176, loss=0.2769370675086975, loss_2=0.17354550957679749 (54.644354s)
(eight dice[...] lines precede each of the following iterations; omitted for brevity)
INFO:niftynet: training iter 618, loss_1=0.161167174577713, loss_3=0.18137377500534058, loss=0.22125926613807678, loss_2=0.21608826518058777 (14.678665s)
INFO:niftynet: training iter 619, loss_1=0.12063813209533691, loss_3=0.15977877378463745, loss=0.20015588402748108, loss_2=0.1931440234184265 (18.208534s)
INFO:niftynet: training iter 620, loss_1=0.13824504613876343, loss_3=0.15376624464988708, loss=0.16220593452453613, loss_2=0.19303250312805176 (37.207527s)
INFO:niftynet: training iter 621, loss_1=0.1840934455394745, loss_3=0.09752166271209717, loss=0.2162851095199585, loss_2=0.16845452785491943 (16.948675s)
INFO:niftynet: training iter 622, loss_1=0.1422765851020813, loss_3=0.18754351139068604, loss=0.1856851875782013, loss_2=0.1454884111881256 (17.208762s)
INFO:niftynet: training iter 623, loss_1=0.14178556203842163, loss_3=0.20501017570495605, loss=0.11672213673591614, loss_2=0.24186503887176514 (17.181811s)
INFO:niftynet: training iter 624, loss_1=0.23464366793632507, loss_3=0.1467728316783905, loss=0.1327243447303772, loss_2=0.20535004138946533 (17.847119s)
INFO:niftynet: training iter 625, loss_1=0.1267774999141693, loss_3=0.21737238764762878, loss=0.16196545958518982, loss_2=0.19601640105247498 (17.567361s)
(iterations 626-697 continue in the same pattern, each taking roughly 16-19 s, e.g. iter 650 at 18.200011s and iter 697 at 17.120511s)
0.747638643] dice[0.999193609 0.809907] dice[0.99905771 0.87713] dice[0.998552501 0.628936529] dice[0.998579502 0.807284534] dice[0.998393238 0.60032922] INFO:niftynet: training iter 698, loss_1=0.12408080697059631, loss_3=0.11114144325256348, loss=0.1488533616065979, loss_2=0.0638013482093811 (18.362535s) dice[0.999226868 0.900337338] dice[0.998767614 0.773877144] dice[0.998198628 0.820756614] dice[0.998892426 0.810826361] dice[0.999002934 0.868477881] dice[0.998426557 0.530898631] dice[0.998859465 0.846711099] dice[0.998742521 0.77802521] INFO:niftynet: training iter 699, loss_1=0.08194774389266968, loss_3=0.09283149242401123, loss=0.09441542625427246, loss_2=0.150798499584198 (17.432488s) dice[0.998159349 0.716209] dice[0.999014258 0.764786541] dice[0.999020636 0.771806657] dice[0.998872519 0.817364633] dice[0.995305419 0.592859387] dice[0.997209787 0.395409256] dice[0.99893707 0.764093] dice[0.99916786 0.812209606] INFO:niftynet: training iter 700, loss_1=0.2548040449619293, loss_3=0.13045769929885864, loss=0.10639810562133789, loss_2=0.10323387384414673 (18.578490s) INFO:niftynet: iter 700 saved: /home/user/NiftyNet/models/kidneyFATMAP/models/model.ckpt dice[0.99881053 0.872171342] dice[0.998494267 0.479055792] dice[0.999087036 0.807290196] dice[0.998829067 0.851243496] dice[0.998821795 0.865279078] dice[0.997848094 0.741867244] dice[0.997847259 0.720387757] dice[0.99847883 0.497823656] INFO:niftynet: training iter 701, loss_1=0.09904593229293823, loss_3=0.16286700963974, loss=0.196365624666214, loss_2=0.08588755130767822 (16.169586s) dice[0.996721447 0.710773945] dice[0.998370528 0.694480419] dice[0.99797374 0.684785247] dice[0.99873203 0.793247044] dice[0.998289764 0.835250139] dice[0.998892069 0.875493228] dice[0.999127388 0.861405849] dice[0.998717666 0.793938875] INFO:niftynet: training iter 702, loss_1=0.14991340041160583, loss_3=0.08670255541801453, loss=0.1313154697418213, loss_2=0.07301867008209229 (17.572834s) dice[0.998655677 0.721256852] 
dice[0.999175906 0.88241756] dice[0.995261848 0.647926092] dice[0.998687506 0.778918505] dice[0.9987939 0.775885105] dice[0.998857081 0.786063492] dice[0.998959899 0.892391622] dice[0.998984158 0.876163721] INFO:niftynet: training iter 703, loss_1=0.11010012030601501, loss_3=0.09962350130081177, loss=0.05837517976760864, loss_2=0.14480149745941162 (17.414215s) dice[0.999108 0.876731873] dice[0.998699069 0.831287324] dice[0.997875929 0.741014719] dice[0.998824894 0.827902436] dice[0.999049842 0.879862547] dice[0.998128057 0.726134241] dice[0.998574734 0.837758362] dice[0.99864608 0.73630935] INFO:niftynet: training iter 704, loss_1=0.07354342937469482, loss_3=0.10859549045562744, loss=0.10717788338661194, loss_2=0.09920632839202881 (17.315561s) dice[0.998024 0.777121305] dice[0.99808991 0.804333925] dice[0.998548925 0.787170231] dice[0.998970687 0.859861374] dice[0.998538733 0.624027312] dice[0.999048948 0.771661282] dice[0.999269664 0.890534639] dice[0.998861194 0.793648899] INFO:niftynet: training iter 705, loss_1=0.15168094635009766, loss_3=0.10560771822929382, loss=0.07942140102386475, loss_2=0.08886218070983887 (17.877228s) dice[0.99887377 0.778321624] dice[0.998632669 0.828174591] dice[0.996861279 0.702039] dice[0.998364627 0.833286703] dice[0.998425663 0.827129245] dice[0.996868849 0.613803506] dice[0.998224735 0.795330822] dice[0.998865 0.849818826] INFO:niftynet: training iter 706, loss_1=0.09899932146072388, loss_3=0.14094319939613342, loss=0.1173621118068695, loss_2=0.08944016695022583 (18.222793s) dice[0.998517871 0.765947402] dice[0.99802351 0.694333] dice[0.998129964 0.804651558] dice[0.998731792 0.804733634] dice[0.999117255 0.90490675] dice[0.998553157 0.765373528] dice[0.998126566 0.712992847] dice[0.99781394 0.72725153] INFO:niftynet: training iter 707, loss_1=0.09843826293945312, loss_3=0.08301231265068054, loss=0.14095377922058105, loss_2=0.1357945203781128 (17.342570s) dice[0.998770654 0.803546309] dice[0.998108 0.624575555] dice[0.998854935 
0.843988061] dice[0.999176502 0.883362234] dice[0.99898231 0.864233494] dice[0.999024 0.825144708] dice[0.998731613 0.804034412] dice[0.998036802 0.793796659] INFO:niftynet: training iter 708, loss_1=0.1437498927116394, loss_3=0.07815387845039368, loss=0.06865453720092773, loss_2=0.10135012865066528 (17.051959s) dice[0.998313963 0.821009576] dice[0.99914062 0.846191704] dice[0.998138547 0.758891582] dice[0.998940647 0.831712067] dice[0.998113871 0.755159855] dice[0.999208748 0.854331136] dice[0.998584747 0.755483449] dice[0.998833835 0.64733392] INFO:niftynet: training iter 709, loss_1=0.09829658269882202, loss_3=0.10307928919792175, loss=0.14994102716445923, loss_2=0.08383601903915405 (17.497211s) dice[0.998753846 0.850140929] dice[0.999316335 0.895824313] dice[0.996360362 0.666748822] dice[0.999015391 0.860230148] dice[0.998742163 0.838451505] dice[0.999076962 0.875563] dice[0.998713136 0.717811942] dice[0.998728931 0.849722] INFO:niftynet: training iter 710, loss_1=0.11941131949424744, loss_3=0.06399112939834595, loss=0.10875600576400757, loss_2=0.0720415711402893 (17.830792s) dice[0.997581482 0.758576214] dice[0.999137 0.80327177] dice[0.999161363 0.839692116] dice[0.998886406 0.659500182] dice[0.99930191 0.891400278] dice[0.999221861 0.826292634] dice[0.998565555 0.830770493] dice[0.997200131 0.645150721] INFO:niftynet: training iter 711, loss_1=0.12568998336791992, loss_3=0.07094579935073853, loss=0.13207826018333435, loss_2=0.11035841703414917 (16.527179s) dice[0.998067141 0.762210667] dice[0.997424603 0.619322956] dice[0.998908818 0.796717227] dice[0.999424696 0.910651505] dice[0.998131514 0.753784716] dice[0.998896718 0.839123] dice[0.997222364 0.770002902] dice[0.998858 0.871656895] INFO:niftynet: training iter 712, loss_1=0.10251602530479431, loss_3=0.07357445359230042, loss=0.09056496620178223, loss_2=0.15574365854263306 (16.726404s) dice[0.998717189 0.796170473] dice[0.998731613 0.728545189] dice[0.998284 0.76680845] dice[0.998670936 0.792000413] 
dice[0.998999298 0.78437376] dice[0.997885048 0.661431313] dice[0.997981071 0.727477551] dice[0.997009158 0.626074255] INFO:niftynet: training iter 713, loss_1=0.13932764530181885, loss_3=0.1194588840007782, loss=0.16286450624465942, loss_2=0.11105906963348389 (17.799268s) dice[0.996798277 0.580588281] dice[0.998506248 0.748325825] dice[0.998251498 0.792557836] dice[0.998041689 0.663351357] dice[0.997899234 0.661785722] dice[0.999051571 0.881785] dice[0.998779237 0.811074078] dice[0.998369217 0.680817723] INFO:niftynet: training iter 714, loss_1=0.1689453125, loss_3=0.11486965417861938, loss=0.13694939017295837, loss_2=0.12773993611335754 (17.327370s) dice[0.999049842 0.873395264] dice[0.998548806 0.76138252] dice[0.997935534 0.778286397] dice[0.99908936 0.860683203] dice[0.998499572 0.792888045] dice[0.996714413 0.627849817] dice[0.999050558 0.854835927] dice[0.997910202 0.709424198] INFO:niftynet: training iter 715, loss_1=0.09190589189529419, loss_3=0.09100136160850525, loss=0.10969477891921997, loss_2=0.1460120677947998 (17.338521s) dice[0.996904969 0.806260467] dice[0.999194384 0.807403207] dice[0.9988 0.859852] dice[0.998094916 0.67259413] dice[0.998496175 0.599018455] dice[0.998731494 0.720498502] dice[0.999320269 0.89931047] dice[0.998788595 0.828180254] INFO:niftynet: training iter 716, loss_1=0.09755924344062805, loss_3=0.11766472458839417, loss=0.06860008835792542, loss_2=0.17081385850906372 (17.395103s) dice[0.998569608 0.708690345] dice[0.998833656 0.783579946] dice[0.999096274 0.874931276] dice[0.998819292 0.74308145] dice[0.99905926 0.907922566] dice[0.997643113 0.634677887] dice[0.999161839 0.855634153] dice[0.998351 0.65047878] INFO:niftynet: training iter 717, loss_1=0.1151742935180664, loss_3=0.1275816261768341, loss=0.12409359216690063, loss_2=0.09601795673370361 (17.125979s) dice[0.998798609 0.806369543] dice[0.99912715 0.772611618] dice[0.998952568 0.678206503] dice[0.99538815 0.653833] dice[0.999063551 0.737502575] dice[0.999058545 0.884716] 
dice[0.99650526 0.71514684] dice[0.998209596 0.698978424] INFO:niftynet: training iter 718, loss_1=0.09491485357284546, loss_3=0.14778998494148254, loss=0.1684049367904663, loss_2=0.10577327013015747 (17.602941s) dice[0.999098182 0.909289] dice[0.998936474 0.886098087] dice[0.998752058 0.822242379] dice[0.998168349 0.718834281] dice[0.99890244 0.828949] dice[0.999023855 0.79237175] dice[0.998087227 0.741869152] dice[0.99650085 0.799851716] INFO:niftynet: training iter 719, loss_1=0.05164456367492676, loss_3=0.09518823027610779, loss=0.11550074815750122, loss_2=0.11592274904251099 (18.130572s) dice[0.997987866 0.748365581] dice[0.998518825 0.8427912] dice[0.998533368 0.761383772] dice[0.998178244 0.759080887] dice[0.99812758 0.52708751] dice[0.996561468 0.485414028] dice[0.995437562 0.729706287] dice[0.998440206 0.570712328] INFO:niftynet: training iter 720, loss_1=0.24820232391357422, loss_3=0.12070593237876892, loss=0.17642587423324585, loss_2=0.10308414697647095 (18.558460s) dice[0.999074459 0.833602846] dice[0.998533785 0.781311274] dice[0.999106884 0.831755877] dice[0.999055862 0.846845806] dice[0.999067724 0.887691319] dice[0.99810797 0.719668627] dice[0.998779476 0.72626549] dice[0.997800529 0.626430452] INFO:niftynet: training iter 721, loss_1=0.08080887794494629, loss_3=0.09886610507965088, loss=0.16268101334571838, loss_2=0.09686940908432007 (18.036618s) dice[0.999097407 0.650723815] dice[0.998982906 0.854299724] dice[0.999280572 0.857739806] dice[0.999078453 0.8500489] dice[0.998750687 0.811393321] dice[0.998424053 0.838122904] dice[0.99923 0.8930161] dice[0.999258399 0.917689204] INFO:niftynet: training iter 722, loss_1=0.04770156741142273, loss_3=0.0734630823135376, loss=0.12422400712966919, loss_2=0.08832728862762451 (17.649312s) dice[0.998304248 0.692290366] dice[0.998915374 0.833457232] dice[0.998146117 0.669194341] dice[0.999173105 0.829436541] dice[0.998598814 0.838426769] dice[0.998548269 0.677326202] dice[0.998359561 0.511587858] dice[0.998993278 
0.869505227] INFO:niftynet: training iter 723, loss_1=0.12601244449615479, loss_3=0.1553885042667389, loss=0.12177497148513794, loss_2=0.11925816535949707 (17.600287s) dice[0.998980045 0.857507169] dice[0.998812556 0.797296941] dice[0.998230517 0.790140271] dice[0.997157633 0.658456385] dice[0.998956 0.857902467] dice[0.998823643 0.752124846] dice[0.998529553 0.839593768] dice[0.9971596 0.650983334] INFO:niftynet: training iter 724, loss_1=0.09804826974868774, loss_3=0.08685082197189331, loss=0.1284334361553192, loss_2=0.13900378346443176 (17.641725s) dice[0.998387396 0.611269414] dice[0.998970807 0.763311446] dice[0.998701096 0.728144288] dice[0.998898864 0.820923328] dice[0.998894036 0.855876565] dice[0.998689175 0.824510396] dice[0.996653199 0.765152454] dice[0.998450339 0.771442056] INFO:niftynet: training iter 725, loss_1=0.08050745725631714, loss_3=0.11333310604095459, loss=0.11707547307014465, loss_2=0.15701523423194885 (17.677060s) dice[0.998800755 0.628906] dice[0.997449696 0.67875278] dice[0.999197125 0.878173053] dice[0.998209774 0.674233139] dice[0.999058604 0.890215576] dice[0.99921149 0.871794581] dice[0.99926585 0.898846388] dice[0.998784781 0.745663822] INFO:niftynet: training iter 726, loss_1=0.11254674196243286, loss_3=0.05992996692657471, loss=0.08935976028442383, loss_2=0.17402267456054688 (16.084571s) dice[0.99900192 0.829326868] dice[0.998968542 0.836588681] dice[0.997455478 0.727622271] dice[0.996136785 0.666163504] dice[0.998534739 0.836978734] dice[0.99839884 0.659828] dice[0.998683751 0.842784107] dice[0.997986555 0.657828867] INFO:niftynet: training iter 727, loss_1=0.12567919492721558, loss_3=0.08402848243713379, loss=0.15315547585487366, loss_2=0.12656491994857788 (16.942179s) dice[0.99878037 0.800988674] dice[0.99866122 0.793046951] dice[0.998577058 0.699254096] dice[0.998824477 0.768847048] dice[0.999420345 0.908393681] dice[0.999027133 0.837207794] dice[0.995777667 0.636946738] dice[0.999281526 0.863250494] INFO:niftynet: training 
iter 728, loss_1=0.10213068127632141, loss_3=0.06398776173591614, loss=0.12618589401245117, loss_2=0.13362431526184082 (16.782599s) dice[0.998941302 0.81826365] dice[0.999048829 0.849550605] dice[0.998406887 0.748650551] dice[0.997674644 0.709601283] dice[0.99861747 0.591902375] dice[0.999082804 0.868289411] dice[0.999187469 0.811952055] dice[0.999008834 0.833625257] INFO:niftynet: training iter 729, loss_1=0.13552695512771606, loss_3=0.083548903465271, loss=0.0890565812587738, loss_2=0.13641667366027832 (16.623366s) dice[0.998712301 0.684748232] dice[0.999017715 0.855582058] dice[0.998829 0.838993728] dice[0.998657942 0.738613427] dice[0.998109162 0.760948479] dice[0.998675168 0.734927535] dice[0.997611523 0.691106915] dice[0.999070883 0.901322126] INFO:niftynet: training iter 730, loss_1=0.10622650384902954, loss_3=0.11548495292663574, loss=0.10272213816642761, loss_2=0.126834899187088 (18.729242s) dice[0.998979 0.795233965] dice[0.999307394 0.868934453] dice[0.998628259 0.778688192] dice[0.998702168 0.852025747] dice[0.998469234 0.793761492] dice[0.999061108 0.854646] dice[0.998934388 0.787970662] dice[0.998165071 0.674897969] INFO:niftynet: training iter 731, loss_1=0.08438631892204285, loss_3=0.09298890829086304, loss=0.13500797748565674, loss_2=0.08851554989814758 (17.591175s) dice[0.998379886 0.628451169] dice[0.998033702 0.527625918] dice[0.998287559 0.588187039] dice[0.999030232 0.820886552] dice[0.998444319 0.830746591] dice[0.99719131 0.583211362] dice[0.998551846 0.790266335] dice[0.998669147 0.825931907] INFO:niftynet: training iter 732, loss_1=0.21187734603881836, loss_3=0.14760160446166992, loss=0.09664520621299744, loss_2=0.1484021544456482 (17.517793s) dice[0.998880863 0.835339725] dice[0.998377621 0.78350991] dice[0.998974919 0.872134447] dice[0.998067 0.695233047] dice[0.999041438 0.849086761] dice[0.99890548 0.809816897] dice[0.999211967 0.742794096] dice[0.999053419 0.812446356] INFO:niftynet: training iter 733, loss_1=0.10889765620231628, 
loss_3=0.0959729552268982, loss=0.11162352561950684, loss_2=0.08578735589981079 (35.961315s) dice[0.997985184 0.687875867] dice[0.998892486 0.805066705] dice[0.999095261 0.897449] dice[0.9980551 0.755360842] dice[0.998518527 0.553562045] dice[0.998586 0.811841786] dice[0.996182501 0.488645732] dice[0.999028921 0.862082958] INFO:niftynet: training iter 734, loss_1=0.08750995993614197, loss_3=0.15937289595603943, loss=0.16351497173309326, loss_2=0.12754493951797485 (18.549423s) dice[0.999012411 0.884791613] dice[0.998631954 0.733936727] dice[0.999113262 0.878643513] dice[0.999394417 0.897249] dice[0.998759508 0.820955575] dice[0.99873054 0.79094243] dice[0.998196661 0.767963409] dice[0.998763 0.802767456] INFO:niftynet: training iter 735, loss_1=0.09590679407119751, loss_3=0.05639994144439697, loss=0.10807737708091736, loss_2=0.09765300154685974 (17.657636s) dice[0.999160528 0.856169879] dice[0.999251425 0.797978401] dice[0.997492075 0.639835298] dice[0.997840345 0.675670147] dice[0.996614099 0.539331198] dice[0.999104261 0.852252901] dice[0.997162879 0.66362] dice[0.997906 0.669448376] INFO:niftynet: training iter 736, loss_1=0.15317437052726746, loss_3=0.17229050397872925, loss=0.16796571016311646, loss_2=0.08685994148254395 (18.412692s) dice[0.998143196 0.821071148] dice[0.998772085 0.858799756] dice[0.998337209 0.801871181] dice[0.99899286 0.846639752] dice[0.999131203 0.872734308] dice[0.999099 0.822267354] dice[0.999213457 0.888909459] dice[0.99918 0.880331337] INFO:niftynet: training iter 737, loss_1=0.07669204473495483, loss_3=0.08080345392227173, loss=0.058091431856155396, loss_2=0.08853977918624878 (18.922348s) dice[0.998443186 0.703110397] dice[0.999116182 0.899492264] dice[0.999160826 0.8423118] dice[0.998069167 0.755412519] dice[0.998790085 0.830294371] dice[0.996363759 0.710225821] dice[0.9987849 0.789187968] dice[0.999059 0.806676328] INFO:niftynet: training iter 738, loss_1=0.1012614369392395, loss_3=0.1015729308128357, loss=0.11608150601387024, 
loss_2=0.09995949268341064 (17.543129s) dice[0.996470749 0.706495166] dice[0.999189436 0.888542593] dice[0.999175608 0.829223871] dice[0.998912215 0.833837509] dice[0.998758674 0.839406967] dice[0.998659551 0.717230558] dice[0.999062657 0.840167642] dice[0.998959422 0.77834785] INFO:niftynet: training iter 739, loss_1=0.10232549905776978, loss_3=0.09586560726165771, loss=0.1114860475063324, loss_2=0.08471271395683289 (19.293026s) dice[0.998835504 0.859113574] dice[0.998237967 0.702209771] dice[0.996045887 0.668918848] dice[0.998585105 0.686804771] dice[0.99778676 0.653216958] dice[0.998993516 0.844357312] dice[0.995090187 0.722976863] dice[0.999523044 0.920839548] INFO:niftynet: training iter 740, loss_1=0.0903925895690918, loss_3=0.11040079593658447, loss=0.1264113485813141, loss_2=0.16241136193275452 (17.377713s) dice[0.998900473 0.86187911] dice[0.998908758 0.809776068] dice[0.998726785 0.834869087] dice[0.99911195 0.798645] dice[0.997206 0.652620137] dice[0.999076 0.854563177] dice[0.996717751 0.740760922] dice[0.999206662 0.862799168] INFO:niftynet: training iter 741, loss_1=0.09216180443763733, loss_3=0.08263391256332397, loss=0.10012885928153992, loss_2=0.12413370609283447 (17.931894s) dice[0.9985587 0.798869371] dice[0.998073697 0.63367492] dice[0.99889648 0.817358255] dice[0.998953104 0.819028914] dice[0.99927 0.792822421] dice[0.99903518 0.817512155] dice[0.998602033 0.780200541] dice[0.998656213 0.821513295] INFO:niftynet: training iter 742, loss_1=0.10025697946548462, loss_3=0.0914408266544342, loss=0.09784004092216492, loss_2=0.1427057981491089 (18.273538s) dice[0.999204 0.860398173] dice[0.999181509 0.861208081] dice[0.999031901 0.787724793] dice[0.998911858 0.834731] dice[0.99934566 0.849412143] dice[0.999097407 0.889308] dice[0.999164581 0.879051507] dice[0.99837172 0.804989576] INFO:niftynet: training iter 743, loss_1=0.07000204920768738, loss_3=0.09490010142326355, loss=0.07960566878318787, loss_2=0.06570923328399658 (19.178171s) dice[0.998858273 
0.866655648] dice[0.998808265 0.835546494] dice[0.999047279 0.882722318] dice[0.998503268 0.830671668] dice[0.996568 0.746047318] dice[0.998338878 0.608815134] dice[0.998370588 0.75251472] dice[0.999239564 0.836517513] INFO:niftynet: training iter 744, loss_1=0.07503283023834229, loss_3=0.1625576615333557, loss=0.10333943367004395, loss_2=0.07226383686065674 (19.145044s) dice[0.99865061 0.846453786] dice[0.998350561 0.695019901] dice[0.999103189 0.870074] dice[0.999224722 0.804394364] dice[0.9991557 0.880182087] dice[0.998711884 0.80550164] dice[0.998906 0.832226932] dice[0.999355674 0.89559865] INFO:niftynet: training iter 745, loss_1=0.11538127064704895, loss_3=0.08180093765258789, loss=0.06847819685935974, loss_2=0.07911217212677002 (17.696028s) dice[0.999109745 0.842505872] dice[0.99889791 0.762369871] dice[0.999198139 0.874868393] dice[0.999084473 0.844476342] dice[0.997974873 0.73192513] dice[0.999037325 0.857553959] dice[0.998439 0.513060629] dice[0.99868387 0.783342] INFO:niftynet: training iter 746, loss_1=0.09927913546562195, loss_3=0.10337719321250916, loss=0.07059314846992493, loss_2=0.17661863565444946 (17.805824s) dice[0.998316169 0.744320869] dice[0.998404324 0.8299191] dice[0.998540342 0.836144328] dice[0.998774409 0.783670247] dice[0.999465108 0.885176837] dice[0.999400496 0.865159] dice[0.9982844 0.79624176] dice[0.998115838 0.739794374] INFO:niftynet: training iter 747, loss_1=0.10725989937782288, loss_3=0.0957176685333252, loss=0.11689090728759766, loss_2=0.06269967555999756 (18.307860s) dice[0.999130189 0.855408907] dice[0.998253584 0.583087385] dice[0.998833537 0.767549217] dice[0.999209106 0.905670643] dice[0.999271 0.883957207] dice[0.999292195 0.899822116] dice[0.999249578 0.867343962] dice[0.999236941 0.893438458] INFO:niftynet: training iter 748, loss_1=0.08218437433242798, loss_3=0.141029953956604, loss=0.06018275022506714, loss_2=0.05441436171531677 (18.208760s) dice[0.998675883 0.854995549] dice[0.999187231 0.88684994] dice[0.998329222 
0.786916614] dice[0.998706043 0.828213155] dice[0.997705221 0.846389294] dice[0.998584807 0.763755202] dice[0.997972667 0.738480747] dice[0.998747885 0.708380818] INFO:niftynet: training iter 749, loss_1=0.09695872664451599, loss_3=0.09839135408401489, loss=0.1391044557094574, loss_2=0.06507286429405212 (17.461712s) dice[0.998963416 0.859724402] dice[0.998720825 0.643663883] dice[0.999211729 0.85851866] dice[0.9994 0.892071664] dice[0.998631835 0.858440399] dice[0.997991204 0.646553695] dice[0.998400331 0.748774052] dice[0.99755466 0.617650747] INFO:niftynet: training iter 750, loss_1=0.12459573149681091, loss_3=0.12473183870315552, loss=0.15940505266189575, loss_2=0.06269949674606323 (18.284680s) dice[0.999428749 0.872761786] dice[0.998856366 0.635322452] dice[0.999253809 0.874088705] dice[0.998333216 0.763322949] dice[0.998728216 0.832630694] dice[0.99815768 0.741478205] dice[0.998616457 0.769365191] dice[0.998868525 0.833190203] INFO:niftynet: training iter 751, loss_1=0.12340766191482544, loss_3=0.09998992085456848, loss=0.10725128650665283, loss_2=0.09125033020973206 (17.881378s) dice[0.999249756 0.861357212] dice[0.998908937 0.840289414] dice[0.998334646 0.843567967] dice[0.999114156 0.821634] dice[0.99908 0.862347662] dice[0.998791158 0.834508121] dice[0.999219239 0.885733] dice[0.998295605 0.786116421] INFO:niftynet: training iter 752, loss_1=0.07504868507385254, loss_3=0.08433732390403748, loss=0.08265891671180725, loss_2=0.07631826400756836 (18.482673s) dice[0.998941064 0.784839034] dice[0.999232 0.865926206] dice[0.999079883 0.840597153] dice[0.998267293 0.781218827] dice[0.995924234 0.752251685] dice[0.99898541 0.775199294] dice[0.998424113 0.655354202] dice[0.999053955 0.878244162] INFO:niftynet: training iter 753, loss_1=0.11940985918045044, loss_3=0.08776542544364929, loss=0.11723089218139648, loss_2=0.09520918130874634 (17.721374s) dice[0.997833312 0.596997857] dice[0.998795569 0.817906499] dice[0.99858278 0.452161193] dice[0.998681605 0.849253654] 
dice[0.997464955 0.800825238] dice[0.998623312 0.825655282] dice[0.998712063 0.852933884] dice[0.999254167 0.843670189] INFO:niftynet: training iter 754, loss_1=0.07635742425918579, loss_3=0.14711666107177734, loss=0.09435778856277466, loss_2=0.17533022165298462 (18.570372s) dice[0.999325275 0.935283363] dice[0.998634458 0.83814919] dice[0.999172926 0.871027] dice[0.998809636 0.797135234] dice[0.999163151 0.906015933] dice[0.998740673 0.822205544] dice[0.998808801 0.82913518] dice[0.99909693 0.816219091] INFO:niftynet: training iter 755, loss_1=0.06846868991851807, loss_3=0.0571519136428833, loss=0.08918499946594238, loss_2=0.08346378803253174 (17.876210s) dice[0.999036133 0.903221667] dice[0.998950899 0.71396488] dice[0.998730302 0.812020898] dice[0.998987496 0.857845068] dice[0.997887671 0.77652812] dice[0.998847425 0.827863753] dice[0.998102844 0.695869] dice[0.999111295 0.856590629] INFO:niftynet: training iter 756, loss_1=0.09971827268600464, loss_3=0.09620660543441772, loss=0.11258155107498169, loss_2=0.08310407400131226 (18.513555s) dice[0.996992767 0.784056723] dice[0.998839319 0.678877] dice[0.998821497 0.797918677] dice[0.999109924 0.897789896] dice[0.998883247 0.86776334] dice[0.999148488 0.875868917] dice[0.999013364 0.857234895] dice[0.998936653 0.831357837] INFO:niftynet: training iter 757, loss_1=0.06458401679992676, loss_3=0.07659000158309937, loss=0.0783643126487732, loss_2=0.13530856370925903 (18.294412s) dice[0.998932779 0.855891228] dice[0.999371767 0.896340609] dice[0.998720109 0.835630715] dice[0.997818828 0.555705667] dice[0.997932792 0.642571] dice[0.998470545 0.690016091] dice[0.999014854 0.841302812] dice[0.998997569 0.785878956] INFO:niftynet: training iter 758, loss_1=0.15303117036819458, loss_3=0.06236588954925537, loss=0.09370142221450806, loss_2=0.16775241494178772 (17.540482s) dice[0.998301148 0.669482231] dice[0.999032319 0.801407754] dice[0.998300135 0.739392459] dice[0.998699844 0.790594637] dice[0.998336434 0.813855648] 
dice[0.998481 0.836227655] dice[0.999237835 0.895294905] dice[0.999158144 0.875027478] INFO:niftynet: training iter 759, loss_1=0.08827480673789978, loss_3=0.13294413685798645, loss=0.11825323104858398, loss_2=0.05782043933868408 (17.017406s) dice[0.999324083 0.893579245] dice[0.998732686 0.640516579] dice[0.99835813 0.831053793] dice[0.998940825 0.828101933] dice[0.998631537 0.778891206] dice[0.998617411 0.711859584] dice[0.999285579 0.863531709] dice[0.999150395 0.81255877] INFO:niftynet: training iter 760, loss_1=0.12800005078315735, loss_3=0.08136838674545288, loss=0.08588629961013794, loss_2=0.11696183681488037 (18.070856s) dice[0.998650789 0.822172225] dice[0.999204397 0.868052363] dice[0.998779833 0.811783731] dice[0.998699546 0.815908372] dice[0.998815 0.733842254] dice[0.998803616 0.799515545] dice[0.99846822 0.770564437] dice[0.999089181 0.860615075] INFO:niftynet: training iter 761, loss_1=0.11725592613220215, loss_3=0.0937071442604065, loss=0.09281575679779053, loss_2=0.07798007130622864 (17.789326s) dice[0.998116195 0.801017284] dice[0.998041213 0.693486214] dice[0.998917162 0.762985468] dice[0.999348223 0.892087221] dice[0.999440134 0.891404808] dice[0.999057829 0.880234599] dice[0.995467544 0.668914378] dice[0.99869293 0.848088384] INFO:niftynet: training iter 762, loss_1=0.08666551113128662, loss_3=0.05746564269065857, loss=0.12220919132232666, loss_2=0.12733477354049683 (17.927228s) dice[0.998482347 0.693019271] dice[0.99854207 0.792157888] dice[0.998847723 0.707718134] dice[0.997794747 0.526979148] dice[0.996011376 0.688285172] dice[0.999054968 0.778979778] dice[0.999372661 0.902643263] dice[0.999329388 0.856274068] INFO:niftynet: training iter 763, loss_1=0.13441717624664307, loss_3=0.12944960594177246, loss=0.060595154762268066, loss_2=0.1921650767326355 (17.766066s) dice[0.998830914 0.859610796] dice[0.998738885 0.779465079] dice[0.998864293 0.709608138] dice[0.998344183 0.737377048] dice[0.998731732 0.837487757] dice[0.999160409 0.907291472] 
INFO:niftynet: training iter 764, loss_1=0.09083858132362366, loss_3=0.13895156979560852, loss=0.1429336965084076, loss_2=0.06433212757110596 (19.042748s)
...
INFO:niftynet: training iter 800, loss_1=0.12122946977615356, loss_3=0.08513912558555603, loss=0.06296297907829285, loss_2=0.08958718180656433 (18.076463s)
INFO:niftynet: iter 800 saved: /home/user/NiftyNet/models/kidneyFATMAP/models/model.ckpt
...
INFO:niftynet: training iter 805, loss_1=0.04830586910247803, loss_3=0.15183621644973755, loss=0.096926748752594, loss_2=0.09273150563240051 (18.130037s)
INFO:niftynet: training iter 806, loss_1=0.07244503498077393, loss_3=0.08332046866416931, loss=0.13499301671981812, loss_2=0.08106738328933716 (70.254756s)
INFO:niftynet: training iter 807, loss_1=0.10275021195411682, loss_3=0.11672499775886536, loss=0.12813714146614075, loss_2=0.1084105372428894 (17.827444s)
...
INFO:niftynet: training iter 814, loss_1=0.09555655717849731, loss_3=0.09456488490104675, loss=0.10248151421546936, loss_2=0.10341563820838928 (25.169132s)
INFO:niftynet: training iter 815, loss_1=0.06873571872711182, loss_3=0.0493529736995697, loss=0.06811052560806274, loss_2=0.12462121248245239 (175.856531s)
INFO:niftynet: training iter 816, loss_1=0.13351976871490479, loss_3=0.08142364025115967, loss=0.16726148128509521, loss_2=0.07533478736877441 (174.863183s)
...
INFO:niftynet: training iter 820, loss_1=0.07494404911994934, loss_3=0.12883293628692627, loss=0.12178242206573486, loss_2=0.09422323107719421 (210.300999s)
...
INFO:niftynet: training iter 847, loss_1=0.08061742782592773, loss_3=0.04644864797592163, loss=0.09446492791175842, loss_2=0.07586842775344849 (185.722805s)
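This kind of progressive slowdown is easiest to see by extracting the per-iteration wall time from the saved console output. A minimal sketch (the regex and helper names are my own, assuming the `training iter N, ... (seconds)` line format above):

```python
import re

# Hypothetical helper (not part of NiftyNet): pull the per-iteration wall
# time out of a saved console log, using the "(18.076463s)" suffix of the
# "training iter N, ..." lines shown above.
ITER_RE = re.compile(r"training iter (\d+),.*?\((\d+\.\d+)s\)")

def iteration_times(log_text):
    """Return (iteration, seconds) pairs found in log_text."""
    return [(int(i), float(s)) for i, s in ITER_RE.findall(log_text)]

def flag_slow(times, factor=3.0):
    """Flag iterations slower than `factor` times the first recorded one."""
    if not times:
        return []
    baseline = times[0][1]
    return [(i, s) for i, s in times if s > factor * baseline]

if __name__ == "__main__":
    log = (
        "INFO:niftynet: training iter 805, loss=0.0969 (18.130037s) "
        "INFO:niftynet: training iter 815, loss=0.0681 (175.856531s)"
    )
    # Prints the iterations whose wall time blew up relative to the first.
    print(flag_slow(iteration_times(log)))
```

Plotting the extracted times over the full log would show whether the jump is a one-off spike (e.g. iter 806 above) or a sustained regime change (iter 815 onward).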

Sandv avatar Aug 22 '18 12:08 Sandv

I am doing another run without TensorBoard - maybe that has something to do with it. I will update tomorrow. Thank you very much.

Update: same results with TensorBoard off. At the time of the slowdown, CPU utilization is low and no swapping occurs. I can't fully make sense of this...

Sandv avatar Aug 22 '18 16:08 Sandv

@wyli Did you have a look at this after re-opening it at the time? Is it an ongoing issue, or could this ticket be closed? Cheers!

mmodat avatar May 23 '19 16:05 mmodat

It was still relatively slow, but if other people aren't complaining I guess it is some local issue and it would be OK to close... thank you very much! Veit

Sandv avatar May 23 '19 16:05 Sandv