seya
Why is the backward output reversed along the first axis in the bidirectional RNN?
Apologies if I'm missing something obvious, but when you get the output for the bidirectional RNN:
Xf = self.forward.get_output(train)
Xb = self.backward.get_output(train)
Xb = Xb[::-1]
It seems you're reversing the first (batch) dimension, aren't you? (I'm assuming I'm wrong, since, looking back at the file's history, this part hasn't changed and no one else seems to have asked about it, but from what I understand, Keras does every operation on batches.)
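For what it's worth, here is a quick numpy check of the claim. In Keras, recurrent-layer outputs are laid out as (batch, time, features), so `Xb[::-1]` shuffles samples within the batch, while `Xb[:, ::-1]` reverses each sample in time (which is what a backward RNN's output needs):

```python
import numpy as np

# A stand-in for the backward RNN output: 2 samples, 3 timesteps, 1 feature.
Xb = np.arange(2 * 3 * 1).reshape(2, 3, 1)

# Xb[::-1] flips axis 0, i.e. it swaps the *samples* in the batch:
assert np.array_equal(Xb[::-1][0], Xb[1])

# Xb[:, ::-1] flips axis 1, reversing each sample's *timesteps*:
assert np.array_equal(Xb[:, ::-1][0, 0], Xb[0, 2])
```

So `Xb[::-1]` silently misaligns samples with their targets instead of reversing time.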
I think you are right. Did you do some unit test to confirm?
Also, would you have time to PR the change Xb = Xb[:, ::-1]
along with some tests to prove correctness?
I didn't do any tests; I just looked over the code while trying to figure out why it was complaining about build
getting 2 arguments, and noticed that. I gave up and went back to trying to figure out how to use the graph model in Keras.
(Not sure if the whole build issue is a problem with your implementation; Keras has been complaining a lot.)
I'll probably have more time tomorrow to try to debug this; right now I'm trying to pull off a miracle :(
By the way, note that this code base uses an older version of Keras; there are other ways to do a biRNN with keras_1.
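For anyone landing here, the overall pattern is framework-independent. A minimal numpy sketch (my own illustration, not seya's or Keras's code; `simple_rnn` and the weight names are made up for the example): run one RNN forward, run a second one over the time-reversed input, flip that output back along the time axis so the two align step-by-step, then concatenate on the feature axis.

```python
import numpy as np

def simple_rnn(X, Wx, Wh):
    """Minimal tanh RNN over a (batch, time, features) tensor.

    Returns the hidden state at every timestep, shape (batch, time, hidden).
    """
    batch, time, _ = X.shape
    h = np.zeros((batch, Wh.shape[0]))
    outs = []
    for t in range(time):
        h = np.tanh(X[:, t] @ Wx + h @ Wh)
        outs.append(h)
    return np.stack(outs, axis=1)

rng = np.random.default_rng(0)
X = rng.standard_normal((2, 5, 3))                       # (batch, time, features)
Wx = rng.standard_normal((3, 4))
Wh = rng.standard_normal((4, 4))

fwd = simple_rnn(X, Wx, Wh)
# Backward pass: feed the time-reversed input, then flip the *time* axis
# (axis 1, not axis 0) of the result so step t lines up with fwd's step t:
bwd = simple_rnn(X[:, ::-1], Wx, Wh)[:, ::-1]
bi = np.concatenate([fwd, bwd], axis=-1)                 # (batch, time, 2 * hidden)
```

Both reversals here use `[:, ::-1]`, which is exactly the fix proposed above.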