Any example for sequential recommendations?
Hi! I've been reading the docs and I couldn't find any example for sequential recommendations, commonly used for session-based recommenders.
Is that use case covered by the functionalities in this repo? Any example about it?
Thanks!
Hi @jiwidi I am looking for the same - best I found was this issue: https://github.com/tensorflow/recommenders/issues/119
@ydennisy I'll check it out, thanks!
@maciejkula I would also like to suggest that I feel this could be quite a good tutorial to add if possible :)
Published 21 hours ago: https://github.com/tensorflow/recommenders/blob/main/docs/examples/sequential_retrieval.ipynb
Pretty version: https://www.tensorflow.org/recommenders/examples/sequential_retrieval
Yep, we created that tutorial after seeing your feature request here
I read the TFRS paper.
How can we share the same movie embeddings for past movie watches and the movie label? This approach is shown in a figure from the paper: https://i.imgur.com/LABo21v.png
Is using the same vocab to build the embeddings enough? Thanks!
Maybe @maciejkula or @windmaple could shed some light on this.. :)
You will need to share the embedding table you use for past movie watches and movie labels in order to share the embeddings.
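A minimal sketch of what sharing the table means in Keras terms (toy vocabulary and variable names are my own, not from the TFRS tutorial): both sides must call the same `Embedding` layer *object*, so they read and update one weight matrix.

```python
import tensorflow as tf

titles = ["Movie A", "Movie B", "Movie C"]  # toy vocabulary
lookup = tf.keras.layers.StringLookup(vocabulary=titles, mask_token=None)

# A single Embedding object: every call site that uses this object
# shares (and trains) the same underlying table.
table = tf.keras.layers.Embedding(lookup.vocabulary_size(), 4)

watch_emb = table(lookup(tf.constant(["Movie A"])))  # past-watch side
label_emb = table(lookup(tf.constant(["Movie A"])))  # label side

# Same id -> identical vector, because the table is shared.
assert bool(tf.reduce_all(watch_emb == label_emb))
```

Using the same vocab alone is not enough: two separate `Embedding` layers built from one vocab would still hold two independent weight tables.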
You will need to share the embedding table you use for past movie watches and movie labels in order to share the embeddings.
Can I share the same embedding table even if I need to use a "string splitter" as the first layer and add a GlobalAveragePooling1D as the last one?
Past watches need a splitter; movie labels don't.
As long as you pass the ids into the same Python embedding layer object, you will be sharing embeddings, regardless of the processing.
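To illustrate the point above with a sketch (my own toy data and names, assuming a padded dense history rather than ragged strings): the history side pools over a sequence and the label side does a single lookup, but both index into the same `Embedding` object, so the embeddings stay shared.

```python
import tensorflow as tf

titles = ["Movie A", "Movie B", "Movie C"]  # toy vocabulary
lookup = tf.keras.layers.StringLookup(vocabulary=titles, mask_token=None)

# One embedding table object, reused by both towers.
shared_embedding = tf.keras.layers.Embedding(
    input_dim=lookup.vocabulary_size(), output_dim=8)

# Query tower: padded (batch, seq_len) tensor of past watches,
# embedded then mean-pooled into one vector per user.
history = tf.constant([["Movie A", "Movie B"], ["Movie C", "Movie C"]])
history_emb = shared_embedding(lookup(history))               # (2, 2, 8)
query = tf.keras.layers.GlobalAveragePooling1D()(history_emb)  # (2, 8)

# Candidate tower: (batch,) label movies, same table, no pooling.
labels = tf.constant(["Movie B", "Movie A"])
candidate = shared_embedding(lookup(labels))                   # (2, 8)
```

The pre- and post-processing (splitting, pooling) can differ per tower; what matters for sharing is that every lookup goes through the one `shared_embedding` object.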