Setting Activations on layers
Hi,
I am working with the seq2seq library and have run into a problem.
My data contains values from 1 to 11399, so my (training) vector looks like: [1, 200, 1235, 11300, ...]
But the 'predicted' values I get back are always between 0 and 1, like [0, 0.2, 0.3, 1, ...]
I guess this is due to an activation function (softmax?). Is there a way to define an activation function for the seq2seq model layers?
Marko
I ran into the same problem. Did you solve it?
I think it's because there's a W2 dense layer after the plain LSTM layer. Line 45 of cell.py is
y = Activation(self.activation)(W2(h))
Here h is the output of the LSTM cell, and y is the output of W2, so you can try changing the activation function on this line.
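To illustrate, here is a minimal sketch of that pattern in plain Keras (not the library's actual cell.py; the names h, W2, and y just mirror the line quoted above). Swapping the default squashing activation for 'linear' lets the outputs leave the [0, 1] range:

```python
from keras.layers import Input, Dense, Activation
from keras.models import Model

# Sketch of the readout pattern described above: a Dense layer (W2)
# followed by an Activation applied to the recurrent output h.
h = Input(shape=(32,))           # stands in for the LSTM cell output
W2 = Dense(1)
y = Activation('linear')(W2(h))  # was: Activation(self.activation)(W2(h))

model = Model(h, y)  # with 'linear', predictions are no longer squashed
```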
I think you are right. The default activation function is tanh. Normalizing the data is another way to solve the problem.
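For completeness, here is a minimal sketch of the normalization route, assuming integer targets in [1, 11399] as in the original question: scale the targets into [0, 1] before training so they match the range of the squashing activation, then invert the scaling on the predictions.

```python
import numpy as np

# Known value range of the targets (from the original question).
y_min, y_max = 1.0, 11399.0

def scale(y):
    """Map targets from [y_min, y_max] into [0, 1]."""
    return (y - y_min) / (y_max - y_min)

def unscale(y):
    """Map model outputs in [0, 1] back to the original scale."""
    return y * (y_max - y_min) + y_min

y_train = np.array([1, 200, 1235, 11300], dtype=np.float32)
y_scaled = scale(y_train)          # feed this to model.fit(...)

# predictions = model.predict(X)   # values in [0, 1]
# restored = unscale(predictions)  # back on the original 1..11399 scale
```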