surface_normal_uncertainty

Using NLL_ours as the loss function results in a negative loss

Taxalfer opened this issue 1 year ago · 3 comments

I wanted to train a new model on my own dataset. When I use NLL_ours as the loss function, the loss value gradually becomes negative during training, while training proceeds normally with L2 or AL. I don't know how to solve this. Looking forward to your reply.

Taxalfer avatar Nov 20 '23 06:11 Taxalfer

Hello, I also want to train a new model on my own dataset, but I noticed that "--dataset_name" only accepts nyu/scannet. Should I load my data with /data/dataloader_custom.py, or are there other steps I should take? Thank you.

Genshin-Impact-king avatar Nov 24 '23 09:11 Genshin-Impact-king

/data/dataloader_custom.py is only used to load data for test.py. If you want to train on your own data, you may need to write a new dataloader.

Taxalfer avatar Nov 30 '23 01:11 Taxalfer

Hi, very sorry for the delayed response.

For NLL_ours, it is natural for the loss to become negative. The loss is a negative log-likelihood over a continuous distribution, and the likelihood there is a probability density, not a probability: it can exceed 1, so its negative log can drop below 0. There is nothing to worry about.
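
A minimal sketch of why this happens, using a Gaussian NLL purely for illustration (NLL_ours in this repo uses its own angular distribution over surface normals, but the same arithmetic applies): as the predicted uncertainty shrinks, the density at the target grows past 1 and the NLL goes negative.

```python
import torch

# A Normal with small scale: the density at its mean is
# 1 / (0.1 * sqrt(2*pi)) ~= 3.99, which is > 1.
dist = torch.distributions.Normal(loc=0.0, scale=0.1)

nll = -dist.log_prob(torch.tensor(0.0))
print(nll)  # ~ -1.38: a negative NLL, expected as predictions sharpen
```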

For custom datasets, you need to write your own dataloader, as different datasets have different formats (e.g. for the GT surface normals).
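
For reference, a minimal training dataloader sketch. Everything here is an assumption about your data layout (RGB PNGs in one directory, GT normals as matching `.npy` arrays in another); the class name and paths are hypothetical, and the returned keys follow the pattern of the repo's NYU loader, so verify them against data/dataloader_nyu.py before use.

```python
import os
import numpy as np
import torch
from torch.utils.data import Dataset
from PIL import Image

class CustomNormalDataset(Dataset):
    """Hypothetical dataset: RGB images + GT surface normals stored as .npy."""
    def __init__(self, img_dir, norm_dir):
        self.img_dir, self.norm_dir = img_dir, norm_dir
        self.names = sorted(f for f in os.listdir(img_dir) if f.endswith(".png"))

    def __len__(self):
        return len(self.names)

    def __getitem__(self, idx):
        name = self.names[idx]
        # RGB image -> (3, H, W) float tensor in [0, 1]
        img = Image.open(os.path.join(self.img_dir, name)).convert("RGB")
        img = torch.from_numpy(np.array(img)).permute(2, 0, 1).float() / 255.0
        # GT normals assumed stored as (H, W, 3) float arrays in [-1, 1]
        norm = np.load(os.path.join(self.norm_dir, name.replace(".png", ".npy")))
        norm = torch.from_numpy(norm).permute(2, 0, 1).float()
        # valid mask: pixels where the GT normal has non-negligible length
        mask = (norm.norm(dim=0, keepdim=True) > 0.5).float()
        return {"img": img, "norm": norm, "norm_valid_mask": mask}
```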

baegwangbin avatar Dec 01 '23 16:12 baegwangbin