
Why are the results not stable?

Open wuqiangch opened this issue 8 years ago • 3 comments

Without changing any other parameters, training again can give better (or maybe worse) results. Why?

wuqiangch avatar Mar 14 '16 01:03 wuqiangch

Because of the random network initialization, there's no guarantee that the loss will be stable or reach a point that gives you a low value. You should consider restarting the training or tuning some parameters if the loss is not what you want.
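
If you want run-to-run repeatability while debugging, a minimal sketch (assuming training uses a standard Caffe solver prototxt, which this project does; the `net` path below is just a placeholder) is to pin the solver's random seed:

```protobuf
# solver.prototxt (sketch): a fixed random_seed makes weight initialization
# and other RNG-driven steps deterministic, so repeated runs start identically.
net: "path/to/train.prototxt"   # placeholder, point this at your own net
base_lr: 0.01
random_seed: 42                 # any fixed value; the default -1 means non-deterministic
```

With the seed left at its default, every run draws different initial weights, which is exactly why two trainings of the same config can end at different loss values.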

luoyetx avatar Mar 14 '16 02:03 luoyetx

If I get more data and finetune the models, maybe I can get better results, because then I would start from a better network initialization.

wuqiangch avatar Mar 14 '16 03:03 wuqiangch

More data can help train the network, but the network initialization has nothing to do with your data. Actually, the default weight initialization method is fine for most situations, and the loss is likely to converge and become small as training goes on. An example is sketched below.
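
For reference, a typical Caffe convolution layer declares its initialization through a weight filler like this (a generic sketch, not the project's actual prototxt); the filler draws random values when the net is set up, which is the randomness being discussed:

```protobuf
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  convolution_param {
    num_output: 20
    kernel_size: 4
    weight_filler { type: "xavier" }    # random draw at net setup, differs per run
    bias_filler { type: "constant" }    # biases start at zero
  }
}
```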

luoyetx avatar Mar 14 '16 03:03 luoyetx