
Why is the backward output reversed along the first axis in the bidirectional RNN?

ReallyCoolName opened this issue 8 years ago • 3 comments

Apologies if I'm missing something obvious, but when you get the output for the bidirectional RNN:

    Xf = self.forward.get_output(train)
    Xb = self.backward.get_output(train)
    Xb = Xb[::-1]

Aren't you reversing the first (batch) dimension here? (I'm assuming I'm wrong, since this part hasn't changed in the file's history and no one else seems to be asking about it, but as I understand it, Keras performs every operation on batches.)
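To make the distinction concrete, here is a minimal NumPy sketch, assuming the usual Keras `(batch, timesteps, features)` layout the commenter describes (the array and names are illustrative, not from the seya code):

```python
import numpy as np

# Hypothetical backward-RNN output with shape (batch, timesteps, features).
Xb = np.arange(24).reshape(2, 3, 4)  # 2 samples, 3 timesteps, 4 features

# Xb[::-1] reverses axis 0: it swaps the two *samples*, leaving each
# sample's timestep order untouched.
swapped_samples = Xb[::-1]
assert np.array_equal(swapped_samples[0], Xb[1])

# Xb[:, ::-1] reverses axis 1: each sample's *timesteps* run backwards,
# which is what aligning a backward pass with a forward pass needs.
reversed_time = Xb[:, ::-1]
assert np.array_equal(reversed_time[0, 0], Xb[0, 2])
```

So if the outputs are batch-major, `Xb[::-1]` shuffles samples across the batch rather than reordering timesteps.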

ReallyCoolName avatar Jun 14 '16 14:06 ReallyCoolName

I think you are right. Did you do some unit test to confirm?

Also, would you have time to PR the change `Xb = Xb[:, ::-1]` along with some tests to prove correctness?

EderSantana avatar Jun 14 '16 18:06 EderSantana

I didn't run any tests; I just noticed it while looking over the code, trying to figure out why `build` was complaining about getting 2 arguments. I gave up on that and went back to figuring out how to use the Graph model in Keras. (Not sure whether the whole `build` thing is a problem with your implementation; Keras has been complaining a lot.) I'll probably have more time tomorrow to try to debug this; right now I'm trying to pull off a miracle :(

ReallyCoolName avatar Jun 15 '16 11:06 ReallyCoolName

btw, note that this code base uses an older version of Keras; there are other ways to do a biRNN with keras_1.

EderSantana avatar Jun 15 '16 20:06 EderSantana