Brats2019

Still a big gap with your best result (training average dice: [0.787, 0.546, 0.446] vs [0.915, 0.83, 0.791])

Lightning980729 opened this issue 5 years ago • 24 comments

Hello, sorry to bother you. I tried your advice to train with only HGG data. The result truly improved over the last one: the average dice increased by 10 points. But I cannot reproduce your best result, and there is still a big gap. After training for 20000 epochs, the train.log file is as follows:

[WT, TC, ET]:  average dice: [0.592, 0.323, 0.082]  mean average dice : 0.3323333333333333 average sensitivity: [0.969, 0.973, 0.085]  mean average sensitivity : 0.6756666666666667
[WT, TC, ET]:  average dice: [0.422, 0.365, 0.17]  mean average dice : 0.319 average sensitivity: [0.982, 0.965, 0.247]  mean average sensitivity : 0.7313333333333333
[WT, TC, ET]:  average dice: [0.692, 0.433, 0.242]  mean average dice : 0.45566666666666666 average sensitivity: [0.969, 0.969, 0.246]  mean average sensitivity : 0.7280000000000001
[WT, TC, ET]:  average dice: [0.45, 0.344, 0.014]  mean average dice : 0.26933333333333337 average sensitivity: [0.991, 0.979, 0.022]  mean average sensitivity : 0.664
[WT, TC, ET]:  average dice: [0.682, 0.407, 0.341]  mean average dice : 0.4766666666666666 average sensitivity: [0.979, 0.984, 0.413]  mean average sensitivity : 0.7919999999999999
[WT, TC, ET]:  average dice: [0.598, 0.389, 0.283]  mean average dice : 0.42333333333333334 average sensitivity: [0.983, 0.984, 0.322]  mean average sensitivity : 0.763
[WT, TC, ET]:  average dice: [0.679, 0.438, 0.252]  mean average dice : 0.4563333333333333 average sensitivity: [0.983, 0.977, 0.254]  mean average sensitivity : 0.738
[WT, TC, ET]:  average dice: [0.678, 0.439, 0.262]  mean average dice : 0.45966666666666667 average sensitivity: [0.985, 0.982, 0.274]  mean average sensitivity : 0.747
[WT, TC, ET]:  average dice: [0.691, 0.52, 0.097]  mean average dice : 0.43599999999999994 average sensitivity: [0.98, 0.978, 0.066]  mean average sensitivity : 0.6746666666666666
[WT, TC, ET]:  average dice: [0.634, 0.314, 0.349]  mean average dice : 0.4323333333333333 average sensitivity: [0.993, 0.998, 0.47]  mean average sensitivity : 0.8203333333333335
[WT, TC, ET]:  average dice: [0.675, 0.473, 0.034]  mean average dice : 0.3940000000000001 average sensitivity: [0.987, 0.991, 0.022]  mean average sensitivity : 0.6666666666666666
[WT, TC, ET]:  average dice: [0.673, 0.499, 0.39]  mean average dice : 0.5206666666666667 average sensitivity: [0.974, 0.98, 0.406]  mean average sensitivity : 0.7866666666666666
[WT, TC, ET]:  average dice: [0.678, 0.423, 0.261]  mean average dice : 0.454 average sensitivity: [0.988, 0.994, 0.307]  mean average sensitivity : 0.763
[WT, TC, ET]:  average dice: [0.769, 0.513, 0.349]  mean average dice : 0.5436666666666666 average sensitivity: [0.983, 0.992, 0.346]  mean average sensitivity : 0.7736666666666667
[WT, TC, ET]:  average dice: [0.717, 0.501, 0.336]  mean average dice : 0.518 average sensitivity: [0.989, 0.99, 0.314]  mean average sensitivity : 0.7643333333333334
[WT, TC, ET]:  average dice: [0.787, 0.546, 0.446]  mean average dice : 0.5930000000000001 average sensitivity: [0.982, 0.99, 0.41]  mean average sensitivity : 0.794
[WT, TC, ET]:  average dice: [0.671, 0.572, 0.389]  mean average dice : 0.5439999999999999 average sensitivity: [0.982, 0.978, 0.364]  mean average sensitivity : 0.7746666666666666
[WT, TC, ET]:  average dice: [0.745, 0.573, 0.276]  mean average dice : 0.5313333333333333 average sensitivity: [0.982, 0.986, 0.223]  mean average sensitivity : 0.7303333333333333
[WT, TC, ET]:  average dice: [0.783, 0.598, 0.336]  mean average dice : 0.5723333333333334 average sensitivity: [0.983, 0.989, 0.277]  mean average sensitivity : 0.7496666666666667
[WT, TC, ET]:  average dice: [0.76, 0.642, 0.379]  mean average dice : 0.5936666666666667 average sensitivity: [0.985, 0.98, 0.33]  mean average sensitivity : 0.765

As you can see, the best result is [WT, TC, ET]: average dice: [0.787, 0.546, 0.446]. Do you think it might be a problem with the parameters you set in "parameters.ini"? Or is there any other augmentation for the training data? Because of limited computing resources, I didn't do more experiments.

Lightning980729 avatar Nov 30 '19 10:11 Lightning980729
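As context for the numbers above: a minimal NumPy sketch of how per-region Dice scores are typically computed for BraTS, assuming the standard label convention (1 = necrotic/non-enhancing core, 2 = edema, 4 = enhancing tumor). This is an illustration, not the repo's evaluation code.

```python
import numpy as np

def dice(pred_mask, gt_mask, eps=1e-6):
    # Dice = 2*|P n G| / (|P| + |G|), with eps to avoid division by zero
    inter = np.logical_and(pred_mask, gt_mask).sum()
    return (2.0 * inter + eps) / (pred_mask.sum() + gt_mask.sum() + eps)

def brats_dice(pred, gt):
    # The three BraTS regions are nested unions of the raw labels
    regions = {"WT": (1, 2, 4), "TC": (1, 4), "ET": (4,)}
    return {name: dice(np.isin(pred, labels), np.isin(gt, labels))
            for name, labels in regions.items()}
```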

Could I ask for the checkpoint files to do some tests, if you have saved them? I would appreciate it if you could send me a download link. Thanks.

Lightning980729 avatar Nov 30 '19 10:11 Lightning980729

Maybe you can decrease the learning rate when the dice doesn't improve and see how it goes. Sorry about the request for the checkpoint files; they are the result of a team effort, so it is not appropriate to release them right now. I believe you will get the result from the code.

JohnleeHIT avatar Dec 01 '19 00:12 JohnleeHIT
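A minimal sketch of the suggestion above (the function name and thresholds are hypothetical, not part of this repo): halve the learning rate whenever the recent mean dice stops improving.

```python
def step_lr_on_plateau(lr, dice_history, patience=5, factor=0.5, min_lr=1e-5):
    """Halve lr if the last `patience` dice values never beat the best
    value seen before them; otherwise leave it unchanged."""
    if len(dice_history) <= patience:
        return lr
    best_before = max(dice_history[:-patience])
    if max(dice_history[-patience:]) <= best_before:
        return max(lr * factor, min_lr)
    return lr

# Usage: after each evaluation, lr = step_lr_on_plateau(lr, dice_history)
```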

Have you reproduced the author's results yet?

angledick avatar Dec 19 '19 16:12 angledick

I also didn't get the same result as the author. What's wrong, and how can it be improved?

zwlshine avatar Dec 22 '19 02:12 zwlshine

No, I can't get the result either, and I still don't know the reason.

Lightning980729 avatar Dec 22 '19 02:12 Lightning980729

Since some of you can't get the result, I will retrain with the code again these days. Please wait for my updates, and thanks for your patience.

JohnleeHIT avatar Dec 22 '19 04:12 JohnleeHIT

Because of GPU resource restrictions, we must train on patch volumes. Could we instead resize the whole volume to a smaller size and train on that? Would that improve the dice?

zwlshine avatar Dec 23 '19 05:12 zwlshine

You'd better not do that; resizing the volume means resizing the labels at the same time, which will cause a lot of problems.

JohnleeHIT avatar Dec 23 '19 14:12 JohnleeHIT

Yes, you are right; this task is multi-class classification, which is different from binary classification. I have used this method for binary classification, and there it really can improve the dice.

zwlshine avatar Dec 24 '19 00:12 zwlshine
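To make the pitfall above concrete: if you do resize, the image can be interpolated trilinearly, but the label volume must use nearest-neighbor interpolation, or the interpolation invents invalid class values between labels. A hypothetical SciPy sketch:

```python
from scipy.ndimage import zoom

def resize_volume_and_label(volume, label, target_shape):
    factors = [t / s for t, s in zip(target_shape, volume.shape)]
    volume_small = zoom(volume, factors, order=1)  # trilinear: fine for intensities
    label_small = zoom(label, factors, order=0)    # nearest-neighbor: keeps labels discrete
    return volume_small, label_small
```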

@Lightning980729 @zwlshine I found some mistakes in my code and have uploaded the new version; you should now get the right results. Sorry for the mistakes. Besides, the result I showed is from a model ensemble, so the result of a single model will be slightly inferior to it.

JohnleeHIT avatar Dec 24 '19 14:12 JohnleeHIT

Hello, I have read your new code. I found only one changed function, softmax_weighted_loss, where you re-enabled one line: gt = produce_mask_background(gt, softmaxpred, self.fg_ratio, self.bg_ratio). Apart from this, there are no other changes. I want to confirm this with you!

And I have a question about the learning rate in the parameters.ini file. At first lr=0.001, and when it reaches a plateau it decreases to 0.0005. Where does this happen in your code? I found that the conv3d and Deconv3d functions in models.py both call "slim.l2_regularizer(0.0005)"; does this change the learning rate from 0.001 to 0.0005?

Thank you very much! I am a new learner, so I'm sorry for so many questions, but your code is great, especially your model combination logic!

zwlshine avatar Dec 26 '19 07:12 zwlshine
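For readers following along: produce_mask_background is the author's own helper, but the surrounding idea, a class-weighted softmax cross-entropy that keeps the easy background from drowning out rare classes such as ET, can be sketched generically. This is a TF1-style illustration, not the repo's implementation.

```python
import tensorflow as tf

def weighted_softmax_loss(logits, one_hot_gt):
    # logits, one_hot_gt: (N, D, H, W, C)
    probs = tf.nn.softmax(logits)
    # weight each class inversely to its frequency in the batch
    class_freq = tf.reduce_mean(one_hot_gt, axis=[0, 1, 2, 3])  # (C,)
    weights = 1.0 / (class_freq + 1e-6)
    ce = -one_hot_gt * tf.log(probs + 1e-6)                     # voxel-wise CE
    return tf.reduce_mean(tf.reduce_sum(weights * ce, axis=-1))
```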

You'd better git clone the latest version; several places have changed. As for the learning rate, I just change it in the config file when the dice doesn't increase.

JohnleeHIT avatar Dec 26 '19 07:12 JohnleeHIT
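To clear up the slim.l2_regularizer question above: l2_regularizer(0.0005) is weight decay, a penalty added to the loss, and its 0.0005 matching the second learning rate is a coincidence. The learning rate is a separate optimizer argument. A TF1-style sketch of the distinction (variable shapes and names are placeholders):

```python
import tensorflow as tf
slim = tf.contrib.slim

w = tf.get_variable("w", shape=[3, 3, 3, 4, 16],
                    regularizer=slim.l2_regularizer(0.0005))  # adds 0.0005*||w||^2 to the loss

task_loss = tf.constant(0.0)                    # placeholder for the segmentation loss
reg_loss = tf.losses.get_regularization_loss()  # the collected L2 penalty
total_loss = task_loss + reg_loss

optimizer = tf.train.AdamOptimizer(learning_rate=0.001)  # the lr lives here, not in the regularizer
train_op = optimizer.minimize(total_loss)
```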

I'm sure the only effective change is in the softmax_weighted_loss function. Other changes, such as fractal_net in models.py and self.is_global_path in operations.py, are all commented out.

zwlshine avatar Dec 27 '19 03:12 zwlshine

Hello, I can't get the best result. My best result is average dice: [0.603, 0.62, 0.584]. Do you know how to solve it? Thanks!

zhangjing1170 avatar Dec 27 '19 08:12 zhangjing1170

When using only HGG for training, I can get almost the same dice for WT and TC, but ET is lower; the ET dice is 0.4.

zwlshine avatar Dec 28 '19 02:12 zwlshine

I just git cloned the latest version, so I am still training; I can't answer your question yet. It takes time. I will let you know when I finish.

Lightning980729 avatar Dec 28 '19 02:12 Lightning980729

@JohnleeHIT After about 30000 epochs of training, here are my results from the train.log file (screenshot omitted). As you can see, the WT dice is quite close to the state of the art, but the TC and ET parts still have a long way to go. I decreased the learning rate whenever the dice did not improve.

siyuanSsun avatar Jan 07 '20 05:01 siyuanSsun

@siyuanSsun Did you only use HGG for training? What learning rate did you use?

zwlshine avatar Jan 09 '20 02:01 zwlshine

@zwlshine I used both HGG and LGG for training. However, I randomly chose part of the data as the training set and the rest as the test set. For the first 20000 epochs I used a learning rate of 0.0005, then changed it to 0.0001 for the rest of training.

siyuanSsun avatar Jan 09 '20 02:01 siyuanSsun

Thank you very much! About changing the learning rate: do you mean that when you reach epoch 20000 you stop the process, change learning_rate in the parameters.ini file, and then load the epoch-20000 checkpoint as the pretrained weights for the rest of training?

zwlshine avatar Jan 09 '20 05:01 zwlshine

@zwlshine exactly

siyuanSsun avatar Jan 09 '20 08:01 siyuanSsun
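A hypothetical sketch of that manual schedule (file, section, and directory names are assumptions, not necessarily the repo's): stop training, edit the learning rate in parameters.ini, then restore the latest checkpoint and continue.

```python
import configparser
import tensorflow as tf

config = configparser.ConfigParser()
config.read("parameters.ini")
lr = config.getfloat("train", "learning_rate")  # e.g. now 0.0001

w = tf.get_variable("w", shape=[1])             # stand-in for the model variables
saver = tf.train.Saver()
with tf.Session() as sess:
    ckpt = tf.train.latest_checkpoint("checkpoints/")
    if ckpt:
        saver.restore(sess, ckpt)               # weights from the epoch-20000 run
    # ...continue the training loop with the new lr...
```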

@siyuanSsun Try training only with HGG data.

JohnleeHIT avatar Jan 13 '20 07:01 JohnleeHIT

@JohnleeHIT Hi, I also get a result with a gap from your accuracy. Is there any difference between the released code and your own code?

Med-Process avatar Mar 19 '20 06:03 Med-Process

Hello, I saw in the comment area that you have run this code before. Did you modify the .py scripts outside the path when you ran it? I have encountered some problems; would you mind helping me? Looking forward to your reply! Best wishes.

TIAN-Ww avatar Oct 26 '21 09:10 TIAN-Ww