Taurus
I think so, this would be a solution for issue #608. In the training phase, preprocessing is repeated in every epoch, even when the input data is completely constant.
I think the framework could run the preprocessing pipeline as a first phase, and then apply the data augmenter (if we have one) and the iterator, feeding batch data to the training graph.
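A minimal sketch of the idea (not the framework's actual API; `preprocess` and `batches` are hypothetical names): preprocessing runs once up front, and only batching/iteration repeats per epoch.

```python
preprocess_calls = 0

def preprocess(raw):
    """Expensive but deterministic preprocessing -- should run only once."""
    global preprocess_calls
    preprocess_calls += 1
    return [x * 2 for x in raw]

def batches(data, batch_size):
    """Per-epoch iterator over the cached, preprocessed data."""
    for i in range(0, len(data), batch_size):
        yield data[i:i + batch_size]

raw_data = [1, 2, 3, 4]
cached = preprocess(raw_data)      # phase 1: preprocess exactly once

for epoch in range(3):             # phase 2: iterate every epoch
    for batch in batches(cached, 2):
        pass                       # feed the batch to the training graph

assert preprocess_calls == 1       # preprocessing did not repeat per epoch
```

This is where a data augmenter would slot in, between the cache and the iterator, so only the cheap, randomized steps repeat each epoch.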
@kuhanw I think it might make sense. Using PAD as the initial input of the encoder means the decoder starts from the same initial state regardless of the source, so the same initial state corresponds to different p(T|S). Naturally, the...
@natedingyifeng have you resolved this problem? We ran into the same problem.
I found the same error
I think so, the code is probably wrong:

```python
i = Activation(self.recurrent_activation)(z0)
f = Activation(self.recurrent_activation)(z0)  # <-- bug: should be z1, not z0
c = add([multiply([f, c_tm1]), multiply([i, Activation(self.activation)(z2)])])
o = Activation(self.recurrent_activation)(z3)
h = multiply([o, Activation(self.activation)(c)])
y = Activation(self.activation)(W2(h))
...
```

The forget gate `f` reuses `z0`, the same pre-activation as the input gate `i`; it should be computed from `z1` instead.
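For reference, the standard LSTM gate equations with the fix applied can be sketched in plain NumPy (this is a generic illustration, not the repository's code; `lstm_step` is a hypothetical name):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(z0, z1, z2, z3, c_tm1):
    """One LSTM step with the corrected gating: f comes from z1, not z0."""
    i = sigmoid(z0)                    # input gate
    f = sigmoid(z1)                    # forget gate (the fix: uses z1)
    c = f * c_tm1 + i * np.tanh(z2)    # new cell state
    o = sigmoid(z3)                    # output gate
    h = o * np.tanh(c)                 # hidden state
    return h, c
```

With the bug, `f` and `i` would always be identical, so the network could never learn to forget independently of how strongly it writes new input.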