
Most of the TFR Losses are always zero

Open ashfaq1701 opened this issue 1 month ago • 0 comments

Hi,

I am trying to build a model based on TensorFlow Ranking. The project tries to learn the ranking rule of an online service. My features are 0-1 scaled properties of listings, and my labels are the numeric ranks (positive whole numbers, usually 1 - 500).
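For reference, this is the list-wise shape convention I am assuming for the data (a minimal sketch with made-up sizes and random values; my real pipeline differs only in the values):

```python
import tensorflow as tf

# Illustrative shapes only: TFR's list-wise losses score lists of items, so
# features are [batch, list_size, n_feats] and labels are [batch, list_size].
n_feats, list_size = 8, 5
features = tf.random.uniform([32, list_size, n_feats])             # 0-1 scaled properties
labels = tf.random.uniform([32, list_size], minval=1, maxval=500)  # numeric ranks

ds = tf.data.Dataset.from_tensor_slices((features, labels)).batch(4)
for x, y in ds.take(1):
    print(x.shape, y.shape)  # (4, 5, 8) (4, 5)
```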

Here is my model:

import tensorflow as tf
from tensorflow.keras.layers import Dense, BatchNormalization, Dropout


class RankingModel(tf.keras.Model):

    def __init__(self, n_feats, **kwargs):
        super(RankingModel, self).__init__(**kwargs)

        self.n_feats = n_feats

        # Twelve Dense blocks, each followed by BatchNormalization and Dropout.
        self.inner_layers = []
        for units in [1024, 1024, 512, 512, 256, 256, 128, 128, 64, 64, 32, 32]:
            self.inner_layers += [
                Dense(units, activation="relu"),
                BatchNormalization(),
                Dropout(0.2),
            ]

        self.o = Dense(1)

    def call(self, inputs, training=None):
        x = inputs
        for layer in self.inner_layers:
            # Propagate `training` so Dropout and BatchNormalization switch
            # between train-time and inference-time behavior.
            x = layer(x, training=training)
        return self.o(x)

And here are the compilation and training statements:

import tensorflow as tf
import tensorflow_ranking as tfr

model = RankingModel(n_feats)  # n_feats is the feature count used above

optimizer = tf.keras.optimizers.Adagrad(0.5)

loss = tfr.keras.losses.SoftmaxLoss()
eval_metrics = [
    tfr.keras.metrics.get(key="ndcg", name="metric/ndcg"),
    tfr.keras.metrics.get(key="mrr", name="metric/mrr")
]
model.compile(optimizer=optimizer, loss=loss, metrics=eval_metrics)


model.fit(train_ds, validation_data=valid_ds, epochs=3)

The datasets are properly initialized. My device is an M2 MacBook.

Whichever loss function from the tfr package I use, the loss comes out as 0 in every epoch. What am I doing wrong?

Moreover, the NDCG metric is being computed as infinity.

ashfaq1701 avatar May 11 '24 04:05 ashfaq1701