
Is dot product the right way to predict?

Open JoaoLages opened this issue 5 years ago • 2 comments

While training implicit sequence models, we use losses like hinge, BPR and pointwise. These losses don't directly maximize the dot product, so why do we use it when predicting?

JoaoLages avatar Apr 24 '19 14:04 JoaoLages

These losses maximize the difference between the dot products of the positive and (implicit) negative items, and so using the dot product for prediction is appropriate.
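To make this concrete, here is a minimal numpy sketch (not Spotlight's actual code; the embedding values are made up) of a BPR-style loss computed on dot-product scores. The loss depends only on the gap between the positive and negative scores, so the embedding configurations it favors are exactly the ones where ranking by dot product puts positives above negatives:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def bpr_loss(user, pos_item, neg_item):
    """BPR loss on dot-product scores: -log sigmoid(pos - neg).

    It only depends on the *difference* between the two dot products,
    so minimizing it widens the score gap the dot product measures.
    """
    pos_score = user @ pos_item
    neg_score = user @ neg_item
    return -np.log(sigmoid(pos_score - neg_score))

user     = np.array([0.5, -0.2, 0.8])
good_pos = np.array([0.6, -0.1, 0.9])   # aligned with user -> high dot product
bad_pos  = np.array([-0.6, 0.1, -0.9])  # anti-aligned -> low dot product
neg      = np.array([0.1, 0.3, -0.2])

# The wider the dot-product gap, the lower the loss:
print(bpr_loss(user, good_pos, neg) < bpr_loss(user, bad_pos, neg))  # True
```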

maciejkula avatar Apr 25 '19 03:04 maciejkula

@JoaoLages A bit late to the party, but what we are really optimizing here are the user and item embeddings. The dot product is merely the operation that combines the two embeddings into one score. Backprop goes through the dot product and changes the embeddings in such a way that we get the results we want, i.e. it pushes the score up for positive items and down for negative items. Correct me if I am wrong, @maciejkula
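A toy numpy sketch of that point (hypothetical values, not Spotlight internals): since score = user · item, the gradient of the score with respect to the user embedding is the item embedding, and vice versa. One gradient-ascent step on a positive item's score therefore nudges the two embeddings toward each other, raising the dot product:

```python
import numpy as np

user = np.array([0.1, 0.2])
pos_item = np.array([1.0, 0.0])
lr = 0.1

before = user @ pos_item

# d(score)/d(user) = pos_item and d(score)/d(pos_item) = user,
# so a simultaneous gradient-ascent step on the score is:
g_user, g_item = pos_item.copy(), user.copy()
user = user + lr * g_user
pos_item = pos_item + lr * g_item

after = user @ pos_item
print(after > before)  # True: the positive score went up
```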

nilansaha avatar Nov 15 '20 07:11 nilansaha