Conan
Like this? (Note: comparing strings with `is` is unreliable; `==` is the correct operator.)

```python
def get_output(self, train):
    forward = self.get_forward_output(train)
    backward = self.get_backward_output(train)
    if self.output_mode == 'sum':
        output = forward + backward
    elif self.output_mode == 'concat':
        output = T.concatenate([forward, backward], axis=2)
    ...
```
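The two merge modes differ only in the output's feature size. A minimal NumPy sketch, with hypothetical shapes standing in for the Theano tensors (`timesteps, batch, features` is the layout Theano's `scan` produces):

```python
import numpy as np

# Hypothetical shapes standing in for the real RNN outputs.
timesteps, batch, features = 5, 2, 4
forward = np.random.rand(timesteps, batch, features)
backward = np.random.rand(timesteps, batch, features)

# 'sum' mode: element-wise addition, feature size unchanged.
summed = forward + backward
assert summed.shape == (timesteps, batch, features)

# 'concat' mode: join along the feature axis, doubling its size.
concatenated = np.concatenate([forward, backward], axis=2)
assert concatenated.shape == (timesteps, batch, 2 * features)
```

The practical consequence: with `'concat'` the next layer must accept `2 * hidden_size` inputs, while `'sum'` keeps the interface unchanged.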
In the Keras LSTM, `return_sequences` is handled this way:

```python
if self.return_sequences:
    return outputs.dimshuffle((1, 0, 2))
return outputs[-1]
```

Are you sure the tensor doesn't need to be transposed? To...
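The `dimshuffle((1, 0, 2))` above is exactly that transposition: Theano's `scan` stacks per-timestep outputs as `(timesteps, batch, features)`, and Keras layers expect `(batch, timesteps, features)`. A NumPy sketch of both branches, with assumed shapes:

```python
import numpy as np

# Assumed scan output layout: (timesteps, batch, features).
timesteps, batch, features = 5, 2, 4
outputs = np.random.rand(timesteps, batch, features)

# return_sequences=True: NumPy analogue of dimshuffle((1, 0, 2)).
sequences = outputs.transpose(1, 0, 2)
assert sequences.shape == (batch, timesteps, features)

# return_sequences=False: keep only the final timestep.
last_step = outputs[-1]
assert last_step.shape == (batch, features)
```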
Ok, it seems to be working. I'm using two stacked LSTMs like this:

```python
model = Sequential()
model.add(BiDirectionLSTM(embedding_size, hidden_size, init=initialize))
model.add(Dense(hidden_size, hidden_size, init=initialize))
model.add(Activation('relu'))
model.add(RepeatVector(maxlen))
model.add(BiDirectionLSTM(hidden_size, hidden_size, return_sequences=True, init=initialize))
model.add(TimeDistributedDense(hidden_size, ...
```
Hey, thanks for getting back, Michael. I'll also be working on this and will let you know if I make any progress.
I have an update regarding this. The models trained straight from the Google C codebase did not read correctly. However, the following steps made it possible to load them into...