
Fine tuned with "Stateful LSTMs, Stacked"

erturkkadir opened this issue on Mar 13, 2018 · 2 comments

If you try "Stateful LSTMs, Stacked" with the following parameters, you may get a quicker (and possibly better) fit in terms of frequency and phase, though not amplitude:

from keras.models import Sequential
from keras.layers import LSTM, Dropout, Dense

batch_size = 1

model = Sequential()
model.add(LSTM(128, batch_input_shape=(batch_size, look_back, 1), stateful=True, return_sequences=True))
model.add(Dropout(0.2))
model.add(LSTM(256, stateful=True))
model.add(Dense(1))

model.compile(loss='mse', optimizer='adadelta')

# Train for 5 epochs, resetting the LSTM cell state between epochs
# (states carry across batches within an epoch, but not across epochs).
for i in range(5):
    model.fit(trainX, trainY, epochs=1, batch_size=batch_size, verbose=2, shuffle=False)
    model.reset_states()
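The snippet above assumes trainX and trainY are already shaped for a stateful LSTM. A minimal sketch of one way to build them from a 1-D series follows; the look_back value, the synthetic sine series, and the variable names are assumptions for illustration, not part of the original comment.

import numpy as np

look_back = 40                            # window length fed to the LSTM (assumed value)
series = np.sin(np.arange(0, 100, 0.1))   # placeholder 1-D signal (assumed)

X, y = [], []
for i in range(len(series) - look_back):
    X.append(series[i:i + look_back])     # input window
    y.append(series[i + look_back])       # next value to predict

# Stateful layers require a fixed batch size, so the arrays must match the
# batch_input_shape=(batch_size, look_back, 1) declared in the model.
trainX = np.array(X).reshape(-1, look_back, 1)
trainY = np.array(y)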

erturkkadir · Mar 13 '18 20:03

Did you mean to put more nodes (256) in the upper layer, or was that a typo? And thanks, I will try it out.

sachinruk · Mar 13 '18 22:03

I did; I noticed that using more units in the second stacked layer gives better results in this case.

erturkkadir · Mar 13 '18 22:03