Results 95 comments of massquantity

The algorithm uses the first one. I don't know if it is the best. Maybe your data is very sparse, and every user only has a few interactions, so the...

No, this library doesn't deal with the explanation problem...

Training loss appears in `libreco/algorithms/base.py`, lines 333-337. During training, the process already needs to calculate the training loss in order to update the model, so if we calculate the training loss again...

The training_loss is computed using [tf.nn.sigmoid_cross_entropy_with_logits](https://www.tensorflow.org/api_docs/python/tf/nn/sigmoid_cross_entropy_with_logits), and the eval loss is computed using [sklearn.metrics.log_loss](https://scikit-learn.org/stable/modules/generated/sklearn.metrics.log_loss.html). The math equations are the same, so if you trust the implementations of both TensorFlow and...
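To see why the two losses agree, here is a small NumPy sketch (not the library's actual code) that implements TensorFlow's numerically stable sigmoid-cross-entropy formula and sklearn-style log loss side by side; applying log loss to the sigmoid of the logits gives the same values:

```python
import numpy as np

def sigmoid_cross_entropy_with_logits(labels, logits):
    # TensorFlow's numerically stable form:
    # max(x, 0) - x * z + log(1 + exp(-|x|))
    return np.maximum(logits, 0) - logits * labels + np.log1p(np.exp(-np.abs(logits)))

def log_loss(labels, probs, eps=1e-15):
    # sklearn-style log loss, computed on probabilities
    probs = np.clip(probs, eps, 1 - eps)
    return -(labels * np.log(probs) + (1 - labels) * np.log(1 - probs))

logits = np.array([-2.0, -0.5, 0.3, 1.7])
labels = np.array([0.0, 1.0, 1.0, 0.0])
probs = 1.0 / (1.0 + np.exp(-logits))  # sigmoid maps logits to probabilities

# Both means coincide up to floating-point error
print(np.allclose(sigmoid_cross_entropy_with_logits(labels, logits).mean(),
                  log_loss(labels, probs).mean()))  # True
```

The only practical difference is the input: the TensorFlow op takes raw logits, while `log_loss` takes probabilities, so a sigmoid sits in between.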

Just call `tf.compat.v1.reset_default_graph()` before rerunning:

```python
>>> svdpp = SVDpp(...)
>>> svdpp.fit(...)
>>> tf.compat.v1.reset_default_graph()
>>> svdpp = SVDpp(...)
>>> svdpp.fit(...)
```

The example data only has 100 thousand rows, which is far from enough to train a good model. Besides, are you implying that a "sensible" recommendation is items with similar...

Yeah, YoutubeRanking's attribute is wrong; it will be fixed in the next version. DeepFM doesn't use item sequence as DIN does, and the doc hasn't been updated to adapt to...

I couldn't think of a clean way to do this. The `_set_latent_vectors` function is just used to speed up recommendation. In other words, it trades memory for speed....
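The internals of `_set_latent_vectors` aren't shown here, but the memory-for-speed trade-off can be sketched roughly as follows (the factor matrices and shapes below are illustrative, not the library's actual attributes): cache all user and item vectors once, so each recommendation is a single matrix-vector product instead of recomputing scores per item.

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_items, dim = 1000, 5000, 32

# Hypothetical trained factors; a real model would learn these.
user_vecs = rng.normal(size=(n_users, dim))
item_vecs = rng.normal(size=(n_items, dim))  # cached once, costs O(n_items * dim) memory

def recommend(user_id, k=10):
    # One matrix-vector product over the cached item matrix,
    # instead of rebuilding scores item by item at request time.
    scores = item_vecs @ user_vecs[user_id]
    top_k = np.argpartition(-scores, k)[:k]      # unordered top-k candidates
    return top_k[np.argsort(-scores[top_k])]     # sorted by descending score

print(recommend(0, k=5))
```

The cache makes each query cheap, but the full item matrix has to stay in memory, which is exactly the trade-off described above.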

Yeah, in real-world scenarios it is not uncommon to filter out users with few interactions. But the library doesn't support that, and I think it's easy to filter them using...
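One way to do this kind of filtering before handing the data to the library (a minimal sketch with pandas; the column names are illustrative) is to count interactions per user and keep only users above a threshold:

```python
import pandas as pd

# Toy interaction log; "user" and "item" are illustrative column names.
data = pd.DataFrame({
    "user": [1, 1, 1, 2, 3, 3],
    "item": [10, 11, 12, 10, 11, 13],
})

# Keep only users with at least 3 interactions.
counts = data.groupby("user")["user"].transform("size")
filtered = data[counts >= 3]
print(filtered["user"].unique())  # [1]
```

Since this runs on the raw DataFrame, it works regardless of which model the library trains afterwards.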