
Add recurrent_dropout and recurrent_regularizer to LSTM Layer

Open bmigette opened this issue 6 years ago • 5 comments

Hello,

It would be great to consider adding recurrent_dropout and recurrent_regularizer parameters to the LSTM layer. See: https://keras.io/layers/recurrent/#lstm
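
For reference, the Keras docs linked above expose these as plain constructor arguments. A minimal sketch; the layer size, shapes, and rates here are illustrative only:

```python
# Rough Keras usage of the requested parameters (per the linked docs);
# sizes and rates are illustrative, not recommendations.
from keras.models import Sequential
from keras.layers import LSTM
from keras.regularizers import l2

model = Sequential()
model.add(LSTM(
    64,
    input_shape=(100, 32),            # (timesteps, features)
    dropout=0.2,                      # dropout on the input connections
    recurrent_dropout=0.2,            # dropout on the recurrent connections
    recurrent_regularizer=l2(0.01),   # L2 penalty on the recurrent weights
))
model.compile(optimizer='adam', loss='mse')
```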

bmigette avatar Jan 16 '18 12:01 bmigette

LSTM is a bit tricky, and I need support from the CNTK team. If you know any developers who understand RNNs well, please ask them to contribute.

deepakkumar1984 avatar Jan 18 '18 22:01 deepakkumar1984

I know how the theory goes for RNNs and dropout; however, I am not familiar with CNTK at all...

Maybe these could help (the difference between the two dropouts is sketched after the links): https://stackoverflow.com/questions/44924690/keras-the-difference-between-lstm-dropout-and-lstm-recurrent-dropout

https://pdfs.semanticscholar.org/3061/db5aab0b3f6070ea0f19f8e76470e44aefa5.pdf

http://www.aclweb.org/anthology/C16-1165
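
For whoever picks this up, here is a rough NumPy sketch of where the two masks act in a single recurrent step, following the variational scheme described in the links above (the same mask is reused at every timestep). The cell is a simplified tanh RNN rather than a full LSTM, and all names and sizes are illustrative:

```python
# Sketch of input dropout vs. recurrent dropout in one recurrent step,
# assuming the variational scheme from the linked references: each mask
# is sampled once and reused at every timestep.
import numpy as np

rng = np.random.default_rng(0)

def dropout_mask(shape, rate):
    # Inverted dropout: zero units with probability `rate`, rescale the rest.
    keep = 1.0 - rate
    return (rng.random(shape) < keep).astype(float) / keep

n_in, n_hid, T = 32, 64, 100
W_x = rng.standard_normal((n_in, n_hid)) * 0.01   # input weights
W_h = rng.standard_normal((n_hid, n_hid)) * 0.01  # recurrent weights

x = rng.standard_normal((T, n_in))
h = np.zeros(n_hid)

m_x = dropout_mask((n_in,), rate=0.2)   # "dropout": masks the inputs x_t
m_h = dropout_mask((n_hid,), rate=0.2)  # "recurrent_dropout": masks h_{t-1}

for t in range(T):
    h = np.tanh((x[t] * m_x) @ W_x + (h * m_h) @ W_h)

# "recurrent_regularizer" would correspond to adding a penalty on the
# recurrent weights to the training loss, e.g. 0.01 * np.sum(W_h ** 2).
```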

If you have support from the CNTK team, it might be worth asking them whether an example is available.

bmigette avatar Jan 19 '18 09:01 bmigette

I have asked for help. I'll wait and hopefully can implement this soon.

deepakkumar1984 avatar Jan 25 '18 03:01 deepakkumar1984

Good stuff!

bmigette avatar Jan 25 '18 13:01 bmigette

Any update on this?

bmigette avatar Feb 26 '18 13:02 bmigette