[💡SUG] RepeatNet: expensive tensor multiplications of very sparse matrices
Is your feature request related to a problem? Please describe.
The RepeatNet model contains expensive tensor multiplications of very sparse matrices. These are implemented with dense operations and representations and therefore incur high overheads when `self.num_item` becomes large.
Describe the solution you'd like
Both the 'repeat' and the 'explore' modules one-hot-encode each item in `item_seq` into a vector of length `self.num_item`, and these matrices are then multiplied with a hidden state. Because the one-hot matrices are extremely sparse, implementing the multiplication with dense operations and representations incurs high memory and computational overheads:
Dense matrix operation in Repeat_Recommendation_Decoder
Dense matrix operation in Explore_Recommendation_Decoder
The multiplication of both matrices can be done efficiently using the sparse API of PyTorch; see the sketch below.
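As a rough illustration, here is a minimal sketch of that idea (the helper name `repeat_scores_sparse` is hypothetical, `attn` stands for the decoder's attention weights over sequence positions, and the sketch relies on `torch.bmm` accepting a sparse COO tensor as its first operand):

```python
import torch

def repeat_scores_sparse(item_seq: torch.Tensor,
                         attn: torch.Tensor,
                         num_items: int) -> torch.Tensor:
    """Sparse replacement for multiplying attention weights with the dense
    one-hot map of item_seq; never materializes the (B, L, num_items) map.

    item_seq : (B, L) long tensor of item indices
    attn     : (B, L) attention weights over sequence positions
    returns  : (B, num_items) scores, equal to attn @ one_hot(item_seq)
    """
    batch_size, seq_len = item_seq.shape
    device = item_seq.device
    # Nonzeros of the one-hot map, laid out as (B, num_items, L) so the
    # sparse tensor can be the *first* operand of torch.bmm (the only
    # position where a sparse COO input is supported).
    b_idx = torch.arange(batch_size, device=device).repeat_interleave(seq_len)
    t_idx = torch.arange(seq_len, device=device).repeat(batch_size)
    i_idx = item_seq.reshape(-1)
    sparse_map = torch.sparse_coo_tensor(
        torch.stack([b_idx, i_idx, t_idx]),
        torch.ones(batch_size * seq_len, dtype=attn.dtype, device=device),
        size=(batch_size, num_items, seq_len),
    ).coalesce()
    # (B, num_items, L) @ (B, L, 1) -> (B, num_items, 1)
    return torch.bmm(sparse_map, attn.unsqueeze(2)).squeeze(2)
```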
Describe alternatives you've considered
Additional context
@bkersbergen Thank you for your suggestion! We will consider it in our next development plan.
You can check this modification of RepeatNet: https://github.com/iesl/softmax_CPR_recommend/commit/dccc0f631883ced3eccfee637dac015ddcf9c151#diff-83d3abd653e8e8377a420b979057df9d9064c531bc608ffc08d2e3c7b8be1004
This code lets you avoid the expensive multiplication without using the sparse API of PyTorch.
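For readers who don't want to dig through the diff, a minimal sketch in the same spirit (not the commit's exact code) is to scatter the attention weights directly onto their item ids with `scatter_add_`, which needs neither the dense one-hot map nor sparse tensors:

```python
import torch

def repeat_scores_scatter(item_seq: torch.Tensor,
                          attn: torch.Tensor,
                          num_items: int) -> torch.Tensor:
    """Dense-free alternative that also avoids torch.sparse.

    Equivalent to multiplying attn with the one-hot map of item_seq:
    scores[b, item_seq[b, t]] += attn[b, t] for every position t.
    """
    scores = torch.zeros(item_seq.size(0), num_items,
                         dtype=attn.dtype, device=attn.device)
    # Padded positions (item id 0 in RecBole) should carry zero attention
    # weight, as the model already masks them, so they add nothing here.
    scores.scatter_add_(1, item_seq, attn)
    return scores
```

Either way, memory drops from O(batch × seq_len × num_items) for the dense one-hot map to O(batch × num_items) for the score matrix.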
@ken77921 Thank you for providing the code! We will check it and give feedback to you soon.
@bkersbergen @ken77921 Hello! We have optimized the expensive tensor multiplication in RepeatNet. The details are available in #1916. Thanks again for your suggestion and support!
Great to hear that optimizations for the tensor multiplication in RepeatNet have been implemented! Thank you for addressing the inefficiency. I'll definitely check out the details. Your responsiveness to feedback is appreciated, and I'm glad I could contribute.