Unrecognized keyword arguments passed to LSTM: {'batch_input_shape'
model = Sequential()
model.add(LSTM(4, batch_input_shape=(1, X_train.shape[1], X_train.shape[2]), stateful=True))
model.add(Dense(1))
model.compile(loss='mean_squared_error', optimizer='adam')
model.fit(X_train, y_train, epochs=100, batch_size=1, verbose=1, shuffle=False)
ValueError: Unrecognized keyword arguments passed to LSTM: {'batch_input_shape': (1, 1, 7)}
My versions: TensorFlow 2.17.0, Keras 3.4.1
I've seen a similar issue raised on Stack Overflow. I was able to run the code two weeks ago without error. What keyword argument should I use now?
https://stackoverflow.com/questions/78805181/valueerror-unrecognized-keyword-arguments-passed-to-lstm-batch-input-shape
That's the usage in Keras 2. In Keras 3, you can refer to: https://github.com/keras-team/keras/blob/933579d3c4a585f236982d05d3a74921f9567415/keras/src/layers/rnn/rnn.py#L105-L110
@Ineedsomehelpah
Just adding to the previous comment. Note that:
InputLayer was missing the description for the parameter batch_shape in 3.4.1, and the RNN docstring still needs updating if I understand correctly, otherwise it would still fail. You could do:
import keras
from keras.models import Sequential
from keras.layers import LSTM, Dense, Input
X_train = keras.random.normal((50, 20, 3))
y_train = keras.random.normal((50, 1))
model = Sequential()
# In Keras 3 the fixed batch size goes on the Input layer via batch_shape
model.add(Input(batch_shape=X_train.shape))
model.add(LSTM(units=4, stateful=True))
model.add(Dense(1))
model.compile(loss='mean_squared_error', optimizer='adam')
model.fit(X_train, y_train, epochs=100, batch_size=1)
Here is the code with a slight modification that works with batch_input_shape:
import keras
from keras.models import Sequential
from keras.layers import LSTM, Dense, InputLayer
X_train = keras.random.normal((50, 20, 3))
y_train = keras.random.normal((50, 1))
model = Sequential()
# InputLayer still accepts the legacy batch_input_shape argument, unlike the LSTM layer itself
model.add(InputLayer(batch_input_shape=X_train.shape))
model.add(LSTM(units=4, stateful=True))
model.add(Dense(1))
model.compile(loss='mean_squared_error', optimizer='adam')
model.fit(X_train, y_train, epochs=100, batch_size=1)
Really appreciate all your feedback, it's working for me now. Thanks all
Thanks for confirming. Could you please close the issue as well?
Hello! Could you create a similar example with a SimpleRNN layer using the functional API instead? I am having trouble reproducing the logic with TensorFlow 2.18, Keras 3.8, and the functional API when training the model. So replace the LSTM with a SimpleRNN with stateful=True and keep the Dense layer. However, the official documentation states the following:
Else for functional model with 1 or more Input layers:
`batch_shape=(...)` to all the first layers in your model.
This is the expected shape of your inputs
*including the batch size*.
It should be a tuple of integers, e.g. `(32, 10, 100)`.
- Specify `shuffle=False` when calling `fit()`.
To reset the states of your model, call `.reset_states()` on either
a specific layer, or on your entire model.
I am not sure how .reset_states() is meant to be combined with stateful=True, or whether it makes sense here. Thank you for your time.
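For reference, here is a minimal sketch of what that stateful SimpleRNN setup could look like with the functional API, following the documentation excerpt above. The shapes, unit count, and training settings are illustrative assumptions, and whether it trains cleanly on TensorFlow 2.18 / Keras 3.8 is exactly the open question here:
import keras
from keras.models import Model
from keras.layers import Input, SimpleRNN, Dense
X_train = keras.random.normal((50, 20, 3))  # 50 samples, 20 timesteps, 3 features (illustrative)
y_train = keras.random.normal((50, 1))
# batch_shape fixes the batch size (10 here), as the docs require for stateful RNNs
inputs = Input(batch_shape=(10, 20, 3))
rnn = SimpleRNN(units=4, stateful=True)
outputs = Dense(1)(rnn(inputs))
model = Model(inputs, outputs)
model.compile(loss='mean_squared_error', optimizer='adam')
# batch_size must match batch_shape[0], and shuffle=False keeps batch order
model.fit(X_train, y_train, epochs=5, batch_size=10, shuffle=False)
# Per the docs, states persist across batches until you reset them,
# e.g. between epochs or between independent sequences:
rnn.reset_states()
The reset_states() call is what makes stateful=True usable in practice: the layer deliberately keeps its state across batches, and you decide where a logical sequence ends by resetting manually.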
I am unable to resolve this problem. I have not used this parameter explicitly anywhere, but on saving it is automatically written into my model's binary file, which then shows up as this error when I load the model.
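In case it helps, one workaround that is sometimes suggested for models saved with the old config is to rebuild the same architecture with the Keras 3 API and load only the weights from the saved file, rather than the full model. This is only a sketch: the file name, layer sizes, and shapes below are placeholders, and whether load_weights can read your particular file depends on how the model was originally saved:
import keras
from keras.models import Sequential
from keras.layers import Input, LSTM, Dense
# Rebuild the original architecture, moving the fixed batch shape from the
# LSTM's batch_input_shape onto an Input layer (sizes here are placeholders)
model = Sequential()
model.add(Input(batch_shape=(1, 1, 7)))
model.add(LSTM(4, stateful=True))
model.add(Dense(1))
# Load only the weights from the previously saved file; "my_model.h5" is a
# placeholder for your actual path
model.load_weights("my_model.h5")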