
The training and validation loss dropped to -0.6 and then stopped decreasing

MOMOANNIE opened this issue 2 years ago • 8 comments

Hello FabianIsensee, I am using nnU-Net to train on my own data. After 1000 epochs of training, my training loss and validation loss are both around -0.6 and no longer decrease. What is going on? (screenshot) Below is my training progress: (screenshot)

MOMOANNIE avatar Jan 09 '23 06:01 MOMOANNIE

Hello,

sorry for the late response! Is this question still relevant? To me it looks like the loss might still decrease if training is continued. In what way did the loss not decrease any further? Did you at any point run, e.g., 2000 epochs?

Cheers Ole

dojoh avatar Aug 02 '23 12:08 dojoh

Is this question still relevant

Yes, this also happens when I train with nnU-Net v2. (screenshot)

Did you at any point run e.g. 2000 epochs

I have tried to modify the number of epochs, but it didn't work; training still runs for 1000 epochs. What do I need to change to set the number of epochs correctly?

MOMOANNIE avatar Aug 16 '23 01:08 MOMOANNIE

There are a couple of predefined trainers with more epochs, see https://github.com/MIC-DKFZ/nnUNet/blob/b4e97fe38a9eb6728077678d4850c41570a1cb02/nnunetv2/training/nnUNetTrainer/variants/training_length/nnUNetTrainer_Xepochs.py

You can invoke these trainers using the -tr flag, e.g. nnUNetv2_train DATASET_NAME_OR_ID UNET_CONFIGURATION FOLD -tr nnUNetTrainer_8000epochs
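If none of the predefined lengths fits, a custom trainer variant can be added next to those classes. Below is a minimal sketch, assuming the nnUNetTrainer constructor signature used in the linked file (the exact signature may differ between nnU-Net versions):

```python
import torch
from nnunetv2.training.nnUNetTrainer.nnUNetTrainer import nnUNetTrainer


class nnUNetTrainer_2000epochs(nnUNetTrainer):
    """Sketch of a trainer variant that runs 2000 epochs instead of the default 1000."""

    def __init__(self, plans: dict, configuration: str, fold: int, dataset_json: dict,
                 unpack_dataset: bool = True, device: torch.device = torch.device('cuda')):
        super().__init__(plans, configuration, fold, dataset_json, unpack_dataset, device)
        # the base trainer sets self.num_epochs = 1000; override it here
        self.num_epochs = 2000
```

Placed in the training_length variants folder (or anywhere nnU-Net discovers trainers), such a class can then be selected with -tr nnUNetTrainer_2000epochs, just like the predefined variants.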

dojoh avatar Sep 04 '23 08:09 dojoh


Hello, is it normal for the loss to be a negative number?

975827738 avatar Apr 23 '24 12:04 975827738


I have also encountered a negative loss. May I ask if you have solved it? What causes the loss value to be negative?

iWangTing avatar May 17 '24 10:05 iWangTing
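For readers arriving here with the same question: a negative loss is expected with nnU-Net's default training. The loss combines cross-entropy with a soft Dice term, and the Dice part is implemented so that better overlap drives it toward -1, so the overall loss becomes negative as segmentation quality improves; a plateau around -0.6 is therefore consistent with reasonable segmentation quality rather than a broken run. A simplified, self-contained sketch of such a compound loss (illustrative only, not nnU-Net's actual implementation):

```python
import torch
import torch.nn.functional as F


def dice_ce_loss(logits: torch.Tensor, target: torch.Tensor, eps: float = 1e-5) -> torch.Tensor:
    """Illustrative soft Dice + cross-entropy loss for binary segmentation.

    logits: raw network outputs of shape (N, 1, H, W)
    target: binary ground truth of the same shape
    The Dice term is negated, so a perfect prediction gives roughly -1 + 0,
    which is why the reported training loss can be negative.
    """
    probs = torch.sigmoid(logits)
    intersection = (probs * target).sum()
    dice = (2.0 * intersection + eps) / (probs.sum() + target.sum() + eps)
    ce = F.binary_cross_entropy_with_logits(logits, target.float())
    return ce - dice
```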