
`freq_var` is not saved by `save_weights()`, so `restrict()` does not work on loaded models


Code to reproduce the issue

import tensorflow as tf
import tensorflow_recommenders as tfra
import tensorflow_recommenders_addons.dynamic_embedding as de

def build_model():
    embedding = de.keras.layers.Embedding(
        embedding_size=8,
        init_capacity=1000,
        restrict_policy=de.FrequencyRestrictPolicy,
        name='UserDynamicEmbeddingLayer',
    )
    return tf.keras.Sequential([
        embedding,
        tf.keras.layers.Dense(8, activation='relu'),
        tf.keras.layers.Dense(4),
        tf.keras.layers.Lambda(lambda x: tf.math.l2_normalize(x, axis=1)),
    ])

model = build_model()
model.compile(
    optimizer=de.DynamicEmbeddingOptimizer(tf.keras.optimizers.Adam()),
    loss=tf.keras.losses.BinaryCrossentropy(),
)

x_tensors = tf.convert_to_tensor([1, 2, 3, 4, 5], dtype=tf.int64)
y_tensors = tf.convert_to_tensor([1, 1, 1, 0, 0], dtype=tf.float32)
ds = tf.data.Dataset.from_tensor_slices((x_tensors, y_tensors)).batch(5)

model.fit(ds)

model.save_weights('test_model')

If we then use the model in another file:

import tensorflow as tf
import tensorflow_recommenders as tfra
import tensorflow_recommenders_addons.dynamic_embedding as de

def build_model():
    embedding = de.keras.layers.Embedding(
        embedding_size=8,
        init_capacity=1000,
        restrict_policy=de.FrequencyRestrictPolicy,
        name='UserDynamicEmbeddingLayer',
    )
    return tf.keras.Sequential([
        embedding,
        tf.keras.layers.Dense(8, activation='relu'),
        tf.keras.layers.Dense(4),
        tf.keras.layers.Lambda(lambda x: tf.math.l2_normalize(x, axis=1)),
    ])

model = build_model()
model.compile(
    optimizer=de.DynamicEmbeddingOptimizer(tf.keras.optimizers.Adam()),
    loss=tf.keras.losses.BinaryCrossentropy(),
)

model.load_weights('test_model')
embedding = model.get_layer(index=0)
print(embedding.params.size())
print(embedding.params.restrict_policy.freq_var.size())
embedding.params.restrict(3) # no effect
print(embedding.params.size())

You'll find that the size of freq_var is 0 and that calling restrict() has no effect: with an empty frequency table, the policy has no candidate keys to rank and evict.

Is there a way to save freq_var so I can restrict the embedding size in future training runs? This is a very common scenario in daily incremental training. (A possible workaround is sketched after this comment.)

huangenyan avatar Feb 27 '24 07:02 huangenyan
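
One possible stopgap until freq_var is tracked by save_weights(): dump the frequency table by hand after training and re-insert it after load_weights(). This is a minimal sketch, assuming freq_var is an ordinary de.Variable exposing export() and upsert() (verify against your TFRA version); the .npy file names are made up for illustration:

import numpy as np

# In the training file, after model.save_weights('test_model'):
policy = model.get_layer(index=0).params.restrict_policy
keys, counts = policy.freq_var.export()  # all tracked keys and their frequency counts
np.save('freq_keys.npy', keys.numpy())
np.save('freq_counts.npy', counts.numpy())

# In the second file, after model.load_weights('test_model'):
policy = model.get_layer(index=0).params.restrict_policy
policy.freq_var.upsert(
    tf.convert_to_tensor(np.load('freq_keys.npy')),
    tf.convert_to_tensor(np.load('freq_counts.npy')),
)
model.get_layer(index=0).params.restrict(3)  # freq_var is populated again, so restrict() can evict

This keeps the frequency counts consistent with the restored embedding table as long as both files build the same model.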

Hi @MoFHeka, would you have time to help with this? Thank you!

rhdong avatar Feb 28 '24 17:02 rhdong

Hi, any updates on this issue?

huangenyan avatar Mar 18 '24 03:03 huangenyan

This should be because freq_var is not tracked by the checkpointing machinery. For the time being, you can save and load it manually with the TF checkpoint APIs (sketched after this comment). @huangenyan Fixed in PR #415

MoFHeka avatar May 28 '24 17:05 MoFHeka
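
For reference, manual saving and loading along the lines suggested above might look like this. It is only a sketch, assuming freq_var participates in object-based checkpointing via tf.train.Checkpoint (not verified here); 'freq_ckpt' is a made-up path:

# In the training file, alongside model.save_weights('test_model'):
embedding = model.get_layer(index=0)
ckpt = tf.train.Checkpoint(freq_var=embedding.params.restrict_policy.freq_var)
ckpt.write('freq_ckpt')

# In the second file, after model.load_weights('test_model'):
embedding = model.get_layer(index=0)
ckpt = tf.train.Checkpoint(freq_var=embedding.params.restrict_policy.freq_var)
ckpt.read('freq_ckpt')
embedding.params.restrict(3)  # frequency counts restored, so restrict() now works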