H-DenseUNet
A question about results
Hello, author. I ran the whole code, but my result does not match the result reported in your paper and in the competition. I would like to know what improvements you made.
Looking forward to your reply! Thank you very much
What is your result?
What is your result?

Hello, the test result was submitted to the CodaLab server:

lesion_dice_global: 0.732
lesion_dice: 0.76
lesion_dice_per_case: 0.645
liver_dice_global: 0.899
liver_dice: 0.903
liver_dice_per_case: 0.903

Looking forward to your reply! Thank you very much
When I run step 5, the training loss stays above 0.05. I am still trying to get the final result. Can we discuss this through email or QQ? [email protected]
Hi, we ran into the same result. Has anyone figured out the source of the discrepancy? Perhaps there was a change in the implementation?
Hi! How do you evaluate Dice? Could you please provide some scripts? Thanks!
The scores above come from a LiTS 2017 Challenge submission: you upload your test segmentations to the CodaLab server, which computes the Dice metrics for you.
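For a rough local sanity check before submitting (not the official CodaLab evaluation), a minimal sketch along these lines may help. It assumes NIfTI volumes with labels 0 = background, 1 = liver, 2 = lesion; all file names below are hypothetical placeholders.

```python
# Minimal Dice sketch for a local check, NOT the official LiTS evaluation.
# Assumes prediction and ground truth are NIfTI volumes with labels
# 0 = background, 1 = liver, 2 = lesion.
import numpy as np
import nibabel as nib

def dice(pred, gt):
    """Dice coefficient between two binary masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    denom = pred.sum() + gt.sum()
    if denom == 0:            # both masks empty: treat as perfect overlap
        return 1.0
    return 2.0 * np.logical_and(pred, gt).sum() / denom

def evaluate_case(pred_path, gt_path, label):
    """Dice for one label of one case (per-case Dice is the mean over cases)."""
    pred = nib.load(pred_path).get_fdata()
    gt = nib.load(gt_path).get_fdata()
    return dice(pred == label, gt == label)

# Example usage (hypothetical file names):
# liver_dice  = evaluate_case("pred-0.nii", "segmentation-0.nii", label=1)
# lesion_dice = evaluate_case("pred-0.nii", "segmentation-0.nii", label=2)
```

Note the distinction in the leaderboard metrics: per-case Dice averages the score over volumes, while global Dice pools all voxels across the test set before computing a single score.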