Patrick Orlando
@houghtonweihu the `index` method expects a tensor of embeddings as input, you need to use the index_from_dataset method to index from a dataset. Example: ```python brute_force = tfrs.layers.factorized_top_k.BruteForce(model.user_model) brute_force.index_from_dataset( movies.batch(128).map(lambda...
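To make that concrete, here is a minimal sketch following the standard TFRS retrieval tutorial pattern; `model.user_model`, `model.movie_model` and the `movies` dataset are placeholders for your own towers and candidate dataset:

```python
import tensorflow as tf
import tensorflow_recommenders as tfrs

# Wrap the trained query tower in a brute-force retrieval layer.
brute_force = tfrs.layers.factorized_top_k.BruteForce(model.user_model)

# Index (identifier, embedding) pairs so the layer returns titles, not row indices.
brute_force.index_from_dataset(
    tf.data.Dataset.zip(
        (movies.batch(128), movies.batch(128).map(model.movie_model))
    )
)

# Retrieve the top 10 titles for a query.
scores, titles = brute_force(tf.constant(["42"]), k=10)
```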
Hi @JV-Nunes, with tf-recommenders it should be easy to try both multi-task and separate models, but if you can't, I would initially develop each model separately. How you formulate the...
In general, you want to tune and evaluate on the thing you actually care about. In this case **top k accuracy**.
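As a sketch of what that looks like in TFRS, top-k accuracy over the full candidate corpus can be tracked with `tfrs.metrics.FactorizedTopK`; `movies` and `movie_model` below are placeholders for your own candidate dataset and tower:

```python
import tensorflow_recommenders as tfrs

# Measure top-k categorical accuracy against every candidate in the corpus.
metrics = tfrs.metrics.FactorizedTopK(
    candidates=movies.batch(128).map(movie_model)
)

# Attach the metric to the retrieval task used in compute_loss().
task = tfrs.tasks.Retrieval(metrics=metrics)
```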
`self.user_model` is your `user_model` class and not an instance of that class. In `compute_loss()`, you think you're calling `user_model.call()`, but you are actually calling `user_model.__init__()`. You'll be able to...
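A minimal sketch of the mix-up and the fix (the `UserModel` here is a toy stand-in for your own tower):

```python
import tensorflow as tf


class UserModel(tf.keras.Model):
    """Toy user tower, only to illustrate the class-vs-instance mix-up."""

    def __init__(self):
        super().__init__()
        self.dense = tf.keras.layers.Dense(32)

    def call(self, features):
        return self.dense(features)


features = tf.random.uniform((4, 8))

# Bug: assigning the class itself. Calling it runs UserModel.__init__
# (it tries to construct a new object) instead of the forward pass.
user_model = UserModel

# Fix: instantiate the class once; calling the instance runs call().
user_model = UserModel()
embeddings = user_model(features)  # shape (4, 32)
```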
Yes, this should technically be possible. Usually it isn't done because the ranking model contains additional features, often features that require frequent updates. In terms of cross joining your query...
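The reply above is cut off, but as a rough sketch of what a query-candidate cross join can look like for offline scoring (the shapes and `ranking_model` are assumptions, not the original setup):

```python
import tensorflow as tf

query = tf.random.uniform((1, 16))         # one query's features
candidates = tf.random.uniform((1000, 8))  # features for all candidates

# Cross join: repeat the query once per candidate so every (query, candidate)
# pair can be scored by the ranking model in a single batch.
tiled_query = tf.repeat(query, repeats=tf.shape(candidates)[0], axis=0)
pairs = tf.concat([tiled_query, candidates], axis=1)   # (1000, 24)

# Stand-in for whatever scoring network you trained.
ranking_model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
scores = ranking_model(pairs)                          # one score per candidate
```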
This is the general idea @rlcauvin ```python import tensorflow as tf # Define the tensors with shape (3, 1) tensor1 = tf.constant([['a'], ['b'], ['c']]) tensor2 = tf.constant([['x'], ['y'], ['z']]) #...
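Since the snippet above is truncated, here is a self-contained sketch of two common ways to combine such tensors; which of these the original reply went on to use is an assumption:

```python
import tensorflow as tf

# Define the tensors with shape (3, 1)
tensor1 = tf.constant([['a'], ['b'], ['c']])
tensor2 = tf.constant([['x'], ['y'], ['z']])

# Element-wise string join -> shape (3, 1), values like b'a_x'
joined = tf.strings.join([tensor1, tensor2], separator='_')

# Concatenate along the feature axis -> shape (3, 2)
concatenated = tf.concat([tensor1, tensor2], axis=1)
```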
You can define a forward pass and use it in the `compute_loss` function. This _should_ work, but there is a caveat. If your model architecture has layers that are shared...
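A minimal sketch of that pattern with `tfrs.Model` (the tower and task names are placeholders):

```python
import tensorflow_recommenders as tfrs


class RetrievalModel(tfrs.Model):
    """Share one forward pass between call() and compute_loss()."""

    def __init__(self, user_model, movie_model, task):
        super().__init__()
        self.user_model = user_model      # query tower (placeholder)
        self.movie_model = movie_model    # candidate tower (placeholder)
        self.task = task                  # e.g. tfrs.tasks.Retrieval()

    def call(self, features):
        # Shared forward pass, usable for both training and inference.
        user_embeddings = self.user_model(features["user_id"])
        movie_embeddings = self.movie_model(features["movie_title"])
        return user_embeddings, movie_embeddings

    def compute_loss(self, features, training=False):
        user_embeddings, movie_embeddings = self(features)
        return self.task(user_embeddings, movie_embeddings)
```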
Your calculation is correct @OmarMAmin.
@hkristof03 you are correct, but regardless of the position in the row, the softmax is calculated over all negatives. The sampling probability for an item should be the overall probability...
> sampling probabilities should be computed from the train dataset for each item ID, then these probabilities should be joined to the validation dataset and the item dataset (for mixed...
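A minimal sketch of that workflow, assuming item IDs are strings and using a lookup table to "join" the train-set probabilities onto other tables (the toy data and the `candidate_sampling_probability` feature name are placeholders):

```python
import collections

import tensorflow as tf

# Hypothetical training interactions; in practice this comes from your data.
train_item_ids = ["a", "b", "a", "c", "a", "b"]

# Empirical sampling probability of each item in the *training* data.
counts = collections.Counter(train_item_ids)
total = sum(counts.values())

prob_table = tf.lookup.StaticHashTable(
    tf.lookup.KeyValueTensorInitializer(
        keys=tf.constant(list(counts.keys())),
        values=tf.constant([c / total for c in counts.values()], tf.float32),
    ),
    default_value=1e-6,  # small floor for items never seen in training
)

# Join the probabilities onto any example stream (train, validation, candidates).
probs = prob_table.lookup(tf.constant(["a", "d"]))  # -> [0.5, 1e-06]

# Inside compute_loss, the correction is then applied through the retrieval task:
# loss = self.task(
#     query_embeddings,
#     candidate_embeddings,
#     candidate_sampling_probability=features["candidate_sampling_probability"],
# )
```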