
dp.Dropout fails when dropout = 0.0

Open filmo opened this issue 10 years ago • 0 comments

dp.Dropout's fprop is missing an 'else' condition to handle passing 0.0 into dp.Dropout(), i.e. to 'deactivate' a dropout layer.

Obviously, one can just comment out the dropout layer in the network definition, but it would be better to allow 0.0 to be passed in to turn off dropout when experimenting with network structures programmatically. A sketch of the kind of guard being requested is shown below.
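
As a rough illustration only (the class name, fprop signature, and mask handling here are assumptions, not deeppy's actual internals), the fix amounts to falling through to an identity pass when dropout is 0.0:

```python
import numpy as np

# Hypothetical sketch; deeppy's real Dropout layer may differ in
# attribute names, fprop signature, and how the train/test phase is tracked.
class Dropout:
    def __init__(self, dropout=0.5):
        self.dropout = dropout

    def fprop(self, x, phase='train'):
        if phase == 'train' and self.dropout > 0.0:
            # Sample a keep-mask and rescale so the expected activation
            # is unchanged (inverted dropout).
            keep_prob = 1.0 - self.dropout
            mask = np.random.binomial(1, keep_prob, size=x.shape)
            return x * mask / keep_prob
        else:
            # dropout == 0.0 (or test phase): pass the input through untouched.
            return x
```

With an else branch like this, dp.Dropout(0.0) would behave as a no-op layer instead of failing.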

filmo · Jan 14 '16