
Is it normal for the cross-entropy loss to be big in stage 2?

Open YilanWang opened this issue 2 years ago • 6 comments

I find that the cross_entropy_loss is large during stage 2 training, between 1e-1 and 1e0. From a numerical perspective, this seems to indicate almost no optimization of the codebook lookup.
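For reference, the stage-2 objective being discussed is a per-token cross-entropy between the transformer's predicted code logits and the ground-truth codebook indices from the stage-1 VQ encoder. A minimal sketch of how such a loss is computed (shapes and variable names here are illustrative, not the repo's exact code):

```python
import torch
import torch.nn.functional as F

codebook_size = 1024          # codebook size from the CodeFormer paper
num_tokens = 16 * 16          # e.g. a 16x16 latent grid of code tokens

logits = torch.randn(4, num_tokens, codebook_size)        # (batch, tokens, codes)
gt_indices = torch.randint(0, codebook_size, (4, num_tokens))

# flatten batch and token dims so each token is one classification example
ce = F.cross_entropy(logits.reshape(-1, codebook_size), gt_indices.reshape(-1))
print(ce.item())
```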

YilanWang avatar Apr 26 '23 05:04 YilanWang

How many steps did you train? I trained for 80k steps and the cross_entropy_loss stayed around 2; it never seemed to decrease from the beginning to the present.
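(For scale: assuming the paper's 1024-entry codebook, a completely uninformed predictor gives a much larger cross-entropy, so a plateau near 2 is still well below chance level.)

```python
import math

codebook_size = 1024  # codebook size from the CodeFormer paper
print(math.log(codebook_size))  # ≈ 6.93: cross-entropy of a uniform guess over all codes
```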

HualeiZhuu avatar Apr 27 '23 02:04 HualeiZhuu

I ran 200k steps. The results are very strange, but training does improve the codebook-searching ability. Maybe only this loss doesn't work...

YilanWang avatar Apr 27 '23 02:04 YilanWang

In stages 1 and 2, do you use the pre-trained models from https://github.com/sczhou/CodeFormer/releases/tag/v0.1.0?

HualeiZhuu avatar Apr 27 '23 03:04 HualeiZhuu

> How many steps did you train? I trained for 80k steps and the cross_entropy_loss stayed around 2; it never seemed to decrease from the beginning to the present.

Hi, I've met the same problem. Did you solve it?

CoderSnack avatar Jul 26 '23 09:07 CoderSnack

Although the date is now 9/17/2023, I still think the stage-2 losses are set up wrong. The lq_feat from the training model comes from the encoder, but the ground truth comes from quantize.get_codebook_feat; i.e., the goal is to align the encoder feature with the quantized feature. That seems ridiculous to me.
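For anyone reading along, here is a minimal sketch of the alignment being described: the LQ encoder feature is pulled toward the feature assembled from the frozen codebook using the GT code indices. The names and shapes are illustrative, and quant_feat below merely stands in for the output of quantize.get_codebook_feat:

```python
import torch
import torch.nn.functional as F

b, c, h, w = 4, 256, 16, 16
lq_feat = torch.randn(b, c, h, w, requires_grad=True)  # encoder output on the LQ input

# stand-in for quantize.get_codebook_feat(gt_indices, shape):
# the feature looked up from the frozen codebook via the GT code indices
quant_feat = torch.randn(b, c, h, w)

# pull the LQ encoder feature toward the fixed codebook feature;
# detach() keeps gradients out of the target, so only the encoder updates
feat_loss = F.mse_loss(lq_feat, quant_feat.detach())
feat_loss.backward()
```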

YilanWang avatar Sep 16 '23 17:09 YilanWang