
Keras 3 incompatibility: setting Embedding layer weights

jc-louis opened this issue 1 year ago · 3 comments

import numpy as np
from keras.layers import Embedding

# load already trained embedding matrix
embedding_matrix = np.random.rand(10, 10)

layer = Embedding(
    input_dim=embedding_matrix.shape[0],
    output_dim=embedding_matrix.shape[1],
    trainable=False,
    # set weights, working with Keras 2 but not 3
    weights=[embedding_matrix],
)

I'm trying to migrate to Keras 3.0 and I'm having issues with the Embedding layer. I need to set the embedding weights but I get this error:

ValueError: Unrecognized keyword arguments passed to Embedding: {'weights': [array([[...

jc-louis · Feb 19 '24 10:02

Hi @jc-louis ,

In Keras 2, can you confirm that layer.get_weights() retrieves the weights set on the Embedding layer, as you mentioned above?

SuryanarayanaY · Feb 19 '24 13:02

Yes, but only after compiling the model (before that, layer.get_weights() returns []):

# keras v2 code
from keras.models import Model
from keras.layers import Input
import numpy as np
from keras.layers import Embedding

embedding_matrix = np.random.rand(4, 1)

layer = Embedding(
    input_dim=embedding_matrix.shape[0],
    output_dim=embedding_matrix.shape[1],
    weights=[embedding_matrix],
    trainable=False,
)

input_layer = Input(shape=(1,))
embedded = layer(input_layer)
model = Model(inputs=input_layer, outputs=embedded)
model.compile()

assert np.allclose(embedding_matrix, layer.get_weights()[0])
assert np.allclose(embedding_matrix, model.layers[1].get_weights()[0])

jc-louis · Feb 19 '24 15:02

Hi @jc-louis ,

I've gone through Keras 2 and can confirm that weights is an accepted kwarg for the Layer class, as per the code below: https://github.com/keras-team/keras/blob/601488fd4c1468ae7872e132e0f1c9843df54182/keras/engine/base_layer.py#L329-L335

In Keras 3, only three kwargs are allowed now, i.e. trainable, input_dim, and input_shape. Any other kwargs passed are checked and raise an exception, per the code below:

https://github.com/keras-team/keras/blob/e6e62405fa1b4444102601636d871610d91e5783/keras/layers/layer.py#L264

I am not sure whether this is an intended change or not.

But there is a way to achieve this behaviour in Keras 3: use the layer.set_weights() method after compiling the model. Please refer to the modified code in the attached gist to achieve your requirement.
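A minimal sketch of that workaround, reusing the toy matrix from the Keras 2 example above (variable names are illustrative):

```python
import numpy as np
from keras.layers import Embedding, Input
from keras.models import Model

# Stand-in for an already trained embedding matrix.
embedding_matrix = np.random.rand(4, 1)

# Create the layer WITHOUT the weights kwarg that Keras 3 rejects.
layer = Embedding(
    input_dim=embedding_matrix.shape[0],
    output_dim=embedding_matrix.shape[1],
    trainable=False,
)

inputs = Input(shape=(1,))
model = Model(inputs=inputs, outputs=layer(inputs))
model.compile()

# The layer is built by now, so copying in the pretrained weights succeeds.
layer.set_weights([embedding_matrix])
```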

Hope it helps. Thanks!

SuryanarayanaY · Feb 21 '24 14:02

The weights= layer constructor argument was dropped in Keras 3. I've decided to add it back in the specific case of the Embedding layer, because unlike other layers it gets built in the constructor (so we don't have to store the weights value temporarily while waiting to build) and we had a few public code examples showing Embedding(..., weights=...).

So this is now fixed at HEAD.

Also, you can just do:

layer = Embedding(
    input_dim=embedding_matrix.shape[0],
    output_dim=embedding_matrix.shape[1],
    trainable=False,
)
layer.set_weights([embedding_matrix])

fchollet · Apr 11 '24 23:04