manamir

18 comments by manamir

Right, SoftMax is fine.

Is it possible to define a loss function other than NLL?

Oh, sorry for asking when the answer is pretty clear. I guess I should create a new class, or add a new loss function to the NLL.lua file :)

Hi Nicolas, do you have any clue about using MSECriterion or KLDivergenceCriterion? For both I get this error: opt/tools/torch7/install/share/lua/5.1/dp/loss/loss.lua:82: inconsistent tensor size

OK, I'll do that. The thing is, the only difference from the recurrentlanguagemodel.lua script is that I replaced "loss = opt.softmaxtree and dp.TreeNLL() or dp.NLL()," with "loss = dp.KLDivergence(),"
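To make the change concrete, this is roughly what that line in recurrentlanguagemodel.lua looks like before and after (a sketch only; dp.KLDivergence here is my own Loss subclass, not something that ships with stock dp):

```lua
-- original line in the experiment setup of recurrentlanguagemodel.lua:
-- picks TreeNLL when the --softmaxtree option is set, plain NLL otherwise
loss = opt.softmaxtree and dp.TreeNLL() or dp.NLL(),

-- my modified version: use the custom KL-divergence loss instead
loss = dp.KLDivergence(),
```

Everything else in the script is unchanged.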

I just added the RNN.lua and penntreebank.lua files.

Here: https://gist.github.com/manamir/ac31cedb32ff24db1796

Oh, CE stands for CrossEntropy. I'll add it to the git repo.