
Setting Activations on layers

Open ZadravecM opened this issue 6 years ago • 3 comments

Hi,

I am working with the seq2seq library and I have a problem.

I have a vector whose values range from 1 to 11399, so my (training) vector looks like: [1, 200, 1235, 11300, ...]

But the predicted values I get back are always between 0 and 1, like [0, 0.2, 0.3, 1, ...]

I guess this is due to the activation function (softmax?). Is there a way to define the activation function for the seq2seq model layers?

Marko

ZadravecM avatar Aug 27 '18 11:08 ZadravecM

I ran into the same problem. Did you solve it?

mladl avatar Sep 27 '18 05:09 mladl

I think it's because there is a W2 dense layer after the normal LSTM layer. Line 45 of cell.py is y = Activation(self.activation)(W2(h)): h is the output of the LSTM cell, and y is the output of W2. So you can try changing the activation function on this line.
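Not the library's actual code, but a minimal plain-Keras sketch of the same pattern (the layer sizes and input shapes here are made up) showing why the activation applied after the W2-style Dense layer decides the output range:

```python
# Minimal sketch (plain Keras, not the seq2seq library itself) of how the
# final activation bounds the predictions.
import numpy as np
from keras.models import Model
from keras.layers import Input, LSTM, Dense, Activation

inp = Input(shape=(10, 1))          # 10 timesteps, 1 feature (arbitrary)
h = LSTM(32)(inp)                   # "h": output of the LSTM cell

# tanh head (bounded): predictions are squashed into (-1, 1)
y_tanh = Activation('tanh')(Dense(1)(h))     # shown only for contrast

# linear head (unbounded): predictions can reach the real scale of the targets
y_linear = Activation('linear')(Dense(1)(h))

model = Model(inp, y_linear)
model.compile(optimizer='adam', loss='mse')

x = np.random.rand(4, 10, 1)
print(model.predict(x).shape)       # (4, 1); the linear head places no bound
                                    # on the output range
```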

yyb1995 avatar Nov 04 '18 07:11 yyb1995

> I think it's because there is a W2 dense layer after the normal LSTM layer. Line 45 of cell.py is y = Activation(self.activation)(W2(h)): h is the output of the LSTM cell, and y is the output of W2. So you can try changing the activation function on this line.

I think you are right. The default activation function is tanh. Normalizing the data is another way to solve the problem.
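A sketch of the normalization route (plain scikit-learn, nothing specific to this library; the numbers are just the ones from the original post): scale the targets into [0, 1] before training, then invert the scaling on the predictions.

```python
# Sketch of the normalization approach using scikit-learn's MinMaxScaler.
# The prediction array below is a made-up stand-in for the model's output.
import numpy as np
from sklearn.preprocessing import MinMaxScaler

y = np.array([1, 200, 1235, 11300], dtype=float).reshape(-1, 1)

scaler = MinMaxScaler(feature_range=(0, 1))
y_scaled = scaler.fit_transform(y)            # train the model on y_scaled

# ... model.fit(x, y_scaled) and model.predict(x) would happen here ...
pred_scaled = np.array([[0.0], [0.2], [0.3], [1.0]])

pred = scaler.inverse_transform(pred_scaled)  # back to the 1..11399 scale
print(pred.ravel())
```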

mladl avatar Nov 04 '18 09:11 mladl