Dzmitry Bahdanau

Results: 206 comments of Dzmitry Bahdanau

My opinion is that we can just add these trainable biases without `use_bias` option. However, I have a bad feeling about using `biases_init` initialization scheme for that, because what we...
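The idea of always allocating the bias (instead of gating it behind a `use_bias` option) can be sketched as follows. This is a hedged illustration only, not the actual Blocks code: the `Linear` class, its zero-initialized bias, and the `apply` method are all hypothetical stand-ins.

```python
import numpy as np

class Linear:
    """Minimal linear-layer sketch (hypothetical, not Blocks' API).

    The trainable bias is always allocated, so no use_bias flag is
    needed; it is simply initialized to zeros rather than routed
    through a separate biases_init initialization scheme.
    """

    def __init__(self, input_dim, output_dim, rng=None):
        rng = rng or np.random.default_rng(0)
        self.W = rng.normal(0.0, 0.01, size=(input_dim, output_dim))
        # Always-present trainable bias, zero-initialized.
        self.b = np.zeros(output_dim)

    def apply(self, x):
        return x @ self.W + self.b
```

With this design a caller never has to branch on whether the bias exists; disabling it is equivalent to leaving it at zero.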

No rush, if you are still planning to test this change, I will not close the PR. On 28 October 2015 at 02:14, ablavatski [email protected] wrote: > Sorry, closed by...

Thanks for this insight, Tim! Do you have experience with this trick? Does it actually help? On 6 January 2016 at 15:18, Tim Cooijmans [email protected] wrote: > This is...

This is indeed a situation that we have not foreseen. In theory, we support multiple parents, but as it turns out there might be side effects. Even if allocation would...

Sure, I will try both changes and see whether they break anything, no later than tomorrow morning. What about having a `force` keyword argument for both `allocate()` and `initialize()` that...
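The proposed `force` keyword for `allocate()` and `initialize()` could behave roughly as sketched below. This is an assumption-laden illustration, not the Blocks implementation: the `Brick` class, its flags, and the placeholder parameter values are all hypothetical.

```python
class Brick:
    """Sketch of a hypothetical `force` keyword (not actual Blocks code).

    By default, allocate() and initialize() are no-ops when they have
    already run; passing force=True redoes the work, which is useful
    when a brick has multiple parents that each trigger configuration.
    """

    def __init__(self):
        self.allocated = False
        self.initialized = False
        self.params = None

    def allocate(self, force=False):
        if self.allocated and not force:
            return  # skip re-allocation unless explicitly forced
        self.params = [0.0]  # placeholder for parameter allocation
        self.allocated = True

    def initialize(self, force=False):
        if self.initialized and not force:
            return  # skip re-initialization unless explicitly forced
        if not self.allocated:
            self.allocate()
        self.params = [1.0]  # placeholder for applying an init scheme
        self.initialized = True
```

The point of the flag is that a second call from another parent is harmless by default, while a caller who really wants to reset state can opt in explicitly.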

That's a great wish list. I will try to propose a solution tomorrow that will include (a) changes to the code, that should amount to just a few lines (b)...

Sorry, booked for today to implement an adapter between Blocks and Platoon, but this issue is on the very top of my TODO list.

You can't say that LSTM is not compatible with SequenceContentAttention. The attention class is completely agnostic to where the data it processes comes from. Just pass "states" as the "attended"...
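The point that content attention is agnostic to the origin of its input can be shown with a minimal sketch. This is not `SequenceContentAttention` itself; it is a generic dot-product content attention with hypothetical names, and the softmax/weighted-average structure is the only part grounded in how content attention works in general.

```python
import numpy as np

def content_attention(attended, query):
    """Generic content-attention sketch (illustrative, not Blocks code).

    `attended` is any (T, d) sequence of vectors; it makes no difference
    whether these are encoder outputs or LSTM states. Returns the
    attention-weighted average of the attended sequence.
    """
    scores = attended @ query                 # one score per time step
    weights = np.exp(scores - scores.max())   # numerically stable softmax
    weights /= weights.sum()
    return weights @ attended                 # the "glimpse"
```

Because the function only assumes a sequence of vectors, feeding it recurrent states works exactly like feeding it any other attended sequence.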

Can you please try the most recent Fuel?