
Variational Dropout in Keras

Open · cjnolet opened this issue on Dec 28, 2017 · 5 comments

I noticed that your readme states the variational dropout algorithm has been implemented in Keras' RNN layers.

I want to verify that it is implemented exactly as described in your paper (i.e. that the same dropped-out connections remain constant throughout training).

I'm implementing the encoder-decoder framework from Uber's paper on time-series anomaly prediction.

Thank you.

cjnolet avatar Dec 28 '17 02:12 cjnolet

It should be - I helped with the implementation!

yaringal avatar Dec 28 '17 09:12 yaringal
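For context, a minimal sketch of how this surfaces in Keras: the `dropout` and `recurrent_dropout` arguments of the recurrent layers apply the per-sequence masking described in the paper, reusing the same mask at every timestep. Layer and argument names below follow recent `tf.keras`; the shapes and rates are illustrative.

```python
from tensorflow import keras
from tensorflow.keras import layers

timesteps, features = 50, 8  # illustrative input shape

# `dropout` masks the layer inputs and `recurrent_dropout` masks the recurrent
# state; Keras samples each mask once per forward pass and reuses it at every
# timestep, which is the variational RNN dropout scheme from the paper.
model = keras.Sequential([
    keras.Input(shape=(timesteps, features)),
    layers.LSTM(64, dropout=0.25, recurrent_dropout=0.25),
    layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
```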

go ahead now :)

Namaste

Rohit


rohitash-chandra avatar Dec 28 '17 09:12 rohitash-chandra

Awesome! Any ideas on how I might implement Monte Carlo dropout with it?

I know this might be the wrong place to ask, but given that you helped with the implementation, I figure it's best to get it from the source. Is there some way to keep the recurrent dropout active during the forward pass at test time?

Thanks again!


cjnolet avatar Dec 28 '17 12:12 cjnolet

Keras has a new `training` flag (I think) that you can pass when calling the layer - this should keep dropout enabled at test time as well. Otherwise, have a look at some of my recent repos for how to compile a Keras model to do sampling at test time (e.g. the acquisition function example).

yaringal avatar Dec 28 '17 12:12 yaringal
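To make that concrete, a hedged sketch using recent `tf.keras`: the `training` argument is accepted when a layer is called (not by its constructor), and forcing `training=True` in a functional-API graph keeps the dropout masks active at prediction time so Monte Carlo samples can be drawn. The `mc_predict` helper, sample count, and shapes below are illustrative, not code from this repo.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

timesteps, features = 50, 8  # illustrative input shape

inputs = keras.Input(shape=(timesteps, features))
# training=True at call time forces the (variational) dropout masks to stay
# active in the forward pass, which is what Monte Carlo dropout needs.
x = layers.LSTM(64, dropout=0.25, recurrent_dropout=0.25)(inputs, training=True)
outputs = layers.Dense(1)(x)
mc_model = keras.Model(inputs, outputs)

def mc_predict(model, x_batch, n_samples=100):
    """Run several stochastic forward passes; each pass draws fresh masks."""
    preds = np.stack([model(x_batch).numpy() for _ in range(n_samples)])
    # Sample mean as the prediction, sample std as a rough uncertainty estimate.
    return preds.mean(axis=0), preds.std(axis=0)

x_batch = np.random.randn(4, timesteps, features).astype("float32")
mean, std = mc_predict(mc_model, x_batch)
```

In older standalone Keras versions that predate the `training` call argument, the equivalent trick was to build a backend function with the learning-phase input set to training.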

Hi @cjnolet,

I am also trying to reproduce the framework from Uber's paper in Python with Keras and the TensorFlow backend, but I'm not sure about the correctness of my implementation. Did you manage to code it, and is the code available somewhere?

Thanks a lot

rutagara avatar Dec 05 '18 15:12 rutagara