Transformers4Rec
Transformers4Rec is a flexible and efficient library for sequential and session-based recommendation and works with PyTorch.
### Description Currently, all our unit tests for `data_loader_engine='pyarrow'` check the training and evaluation loop on CPU by setting `no_cuda=True`. We need to run the same checks on GPU with `no_cuda=False`.
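A minimal sketch of the idea, with hypothetical stand-ins (`TrainingArgs`, `device_for`) for the library's real training arguments and test body; it only illustrates how the `no_cuda` flag switches the device the same train/eval check runs on:

```python
from dataclasses import dataclass

# Hypothetical stand-in for the real training arguments class.
@dataclass
class TrainingArgs:
    no_cuda: bool = True

def device_for(args: TrainingArgs) -> str:
    # With no_cuda=False the same loop would run on GPU when one is present.
    return "cpu" if args.no_cuda else "cuda"

# The same train/eval check should pass for both settings.
devices = [device_for(TrainingArgs(no_cuda=flag)) for flag in (True, False)]
print(devices)  # ['cpu', 'cuda']
```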
# ❓ Questions & Help ## Details As titled, how can **Transformers4Rec** be extended to handle next-basket recommendation, not just next-item recommendation?
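One common way to frame next-basket (rather than next-item) prediction is to replace the single next-item target with independent per-item probabilities and keep every item that clears a threshold. A toy sketch of that reframing, not part of the library's API:

```python
import math

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def next_basket_prediction(item_scores, threshold=0.5):
    """Turn per-item logits into a predicted basket (a set of item ids).

    Next-item recommendation takes the argmax over the scores; next-basket
    recommendation instead keeps every item whose independent (sigmoid)
    probability clears the threshold.
    """
    return {item for item, score in item_scores.items()
            if sigmoid(score) >= threshold}

scores = {"milk": 2.1, "bread": 0.4, "nails": -3.0}
print(next_basket_prediction(scores))  # {'milk', 'bread'}
```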
# 🚀 Feature request The original PyTorch implementation of the `TabularDropout` transformation is available at [transformers4rec/torch/tabular/transformations.py](https://github.com/NVIDIA-Merlin/Transformers4Rec/blob/main/transformers4rec/torch/tabular/transformations.py)
Currently blocked by #52.
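A framework-free sketch of the behaviour to be ported, assuming the transformation zeroes an entire feature's representation with probability `p` (names here are illustrative, not the library's API):

```python
import random

def tabular_dropout(features, p=0.3, rng=None):
    """Dropout applied per feature rather than per element: with
    probability `p`, an entire feature's vector is zeroed out.

    `features` maps feature names to lists of floats.
    """
    rng = rng or random.Random()
    return {
        name: ([0.0] * len(vec) if rng.random() < p else vec)
        for name, vec in features.items()
    }

rng = random.Random(0)
out = tabular_dropout({"item_id": [0.5, 0.2], "category": [1.0, 1.0]},
                      p=0.5, rng=rng)
print(out)
```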
Port [TabularLayerNorm](https://github.com/NVIDIA-Merlin/Transformers4Rec/blob/538fc54bb8f2e3dc79224e497bebee15b00e4ab7/transformers4rec/torch/tabular/transformations.py#L90) from PyTorch to TF
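The operation to be ported is standard layer normalization applied independently to each tabular feature's representation. A minimal stdlib-only sketch of that semantics (the real implementation in the linked file uses framework ops and learnable scale/shift parameters):

```python
import math

def layer_norm(vec, eps=1e-5):
    """Normalize one feature's vector to zero mean and unit variance."""
    mean = sum(vec) / len(vec)
    var = sum((x - mean) ** 2 for x in vec) / len(vec)
    return [(x - mean) / math.sqrt(var + eps) for x in vec]

def tabular_layer_norm(features, eps=1e-5):
    """Apply layer norm independently to each tabular feature."""
    return {name: layer_norm(vec, eps) for name, vec in features.items()}

out = tabular_layer_norm({"item_id": [1.0, 3.0], "price": [2.0, 2.0, 8.0]})
print(out["item_id"])
```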
# 🚀 Feature request & Motivation Make it possible to initialize embedding tables with pre-trained embeddings. A common use case is providing embeddings of item metadata such as textual descriptions...
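A toy sketch of the requested behaviour: items with a pre-trained vector (e.g. a text embedding of the item's description) start from it, and the rest fall back to random initialization. The function name and signature are illustrative, not the library's API:

```python
import random

def build_embedding_table(num_items, dim, pretrained=None, rng=None):
    """Return a list of `num_items` embedding vectors.

    Items present in `pretrained` (item id -> vector) start from their
    pre-trained vector; all other rows are randomly initialized.
    """
    rng = rng or random.Random(0)
    pretrained = pretrained or {}
    return [
        list(pretrained[i]) if i in pretrained
        else [rng.uniform(-0.05, 0.05) for _ in range(dim)]
        for i in range(num_items)
    ]

table = build_embedding_table(3, 2, pretrained={1: [0.9, -0.1]})
print(table[1])  # [0.9, -0.1]
```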
# 🚀 Feature request Some HF Transformer architectures, such as XLNet and Transformer-XL, support a recurrence mechanism, which allows processing a sequence in segments while leveraging information from past segments...
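A heavily simplified, framework-free sketch of segment-level recurrence: the sequence is split into fixed-size segments, and each segment is processed together with a memory of the previous segment's hidden states, so context flows past segment boundaries. `layer` stands in for a Transformer layer and is any function mapping a list of states to new states:

```python
def process_with_recurrence(tokens, segment_len, mem_len, layer):
    """Transformer-XL-style segment recurrence, simplified."""
    memory, outputs = [], []
    for start in range(0, len(tokens), segment_len):
        segment = tokens[start:start + segment_len]
        hidden = layer(memory + segment)           # attend over memory + segment
        new_states = hidden[len(memory):]          # keep this segment's outputs
        outputs.extend(new_states)
        memory = (memory + new_states)[-mem_len:]  # roll the memory forward
    return outputs

# Identity "layer" just to show the mechanics.
out = process_with_recurrence(list(range(6)), segment_len=2, mem_len=2,
                              layer=lambda xs: xs)
print(out)  # [0, 1, 2, 3, 4, 5]
```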
# 🚀 Feature request Include support for pairwise loss functions (BPR-max, TOP1-max) ## Motivation Pairwise ranking loss functions are more scalable than cross-entropy loss. ## Notes An implementation of BPR-max and...
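A stdlib-only sketch of the BPR-max loss from GRU4Rec (Hidasi & Karatzoglou, 2018) for one positive item against sampled negatives: a softmax-weighted average of pairwise BPR terms plus a score regularizer. Pairwise losses like this only touch the sampled negatives, which is what makes them cheaper than a full-softmax cross-entropy over the whole catalog:

```python
import math

def bpr_max_loss(pos_score, neg_scores, reg=1.0):
    """BPR-max: -log(sum_j w_j * sigmoid(r_pos - r_j)) + reg * sum_j w_j * r_j^2,
    where w_j is the softmax over the negative scores."""
    sigmoid = lambda x: 1.0 / (1.0 + math.exp(-x))
    # Numerically stable softmax weights over the negative scores.
    m = max(neg_scores)
    exps = [math.exp(s - m) for s in neg_scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    pairwise = sum(w * sigmoid(pos_score - s)
                   for w, s in zip(weights, neg_scores))
    penalty = sum(w * s * s for w, s in zip(weights, neg_scores))
    return -math.log(pairwise + 1e-12) + reg * penalty

loss = bpr_max_loss(2.0, [-1.0, 0.5, 0.0])
print(loss)
```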
- [ ] 1st - Intro to session-based recommendation: specificities, challenges, current libraries, and SOTA architectures
- [ ] 2nd - Intro to the Transformers4Rec API and session-based rec. preprocessing with...