132 comments by massquantity

Okay, it seems that adding `torch.manual_seed` before the code is not effective. However, adding it in the [`get_batch_loader`](https://github.com/massquantity/LibRecommender/blob/master/libreco/batch/batch_data.py#L46) function works:

```python
def get_batch_loader(model, data, neg_sampling, batch_size, shuffle, num_workers=0):
    torch.manual_seed(42)
    ......
```
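For illustration, here is a minimal, self-contained sketch (not LibRecommender's actual code; the function name and dummy data are made up) of making the shuffle order reproducible. It uses a dedicated `torch.Generator` passed to the `DataLoader`, which is an alternative to calling `torch.manual_seed` inside the function:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

def build_loader(features, labels, batch_size=4, seed=42):
    # A dedicated, seeded generator fixes the shuffle order, independent of
    # whatever the global RNG has been used for elsewhere.
    gen = torch.Generator()
    gen.manual_seed(seed)
    dataset = TensorDataset(features, labels)
    return DataLoader(dataset, batch_size=batch_size, shuffle=True, generator=gen)

features = torch.arange(20, dtype=torch.float32).unsqueeze(1)
labels = torch.zeros(20)

# Two loaders built with the same seed yield batches in the same order.
first = [b[0].squeeze().tolist() for b in build_loader(features, labels)]
second = [b[0].squeeze().tolist() for b in build_loader(features, labels)]
assert first == second
```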

Yep, you should use the latest commit version. I think recent updates may affect this.

- numpy 1.23.4
- pandas 1.4.3
- TensorFlow 2.12.0
- torch 2.0.1
- scikit-learn 1.1.1
- scipy 1.8.1

My results on...

What about CPU results? I'm using the CPU since currently I only have access to my laptop. Based on your list, I don't think the packages are an issue now.

I'm also uncertain about it. Try this: https://wandb.ai/sauravmaheshkar/RSNA-MICCAI/reports/How-to-Set-Random-Seeds-in-PyTorch-and-Tensorflow--VmlldzoxMDA2MDQy

Although we mainly use tf1, the installed package is tf2, which may cause some issues. I think you can set both the tf1 and tf2 seeds:

```python
import tensorflow as tf2
...
```
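As a minimal sketch (the seed value is arbitrary, and this assumes a TF2 installation running TF1-style graph code through `tf.compat.v1`), setting both seeds could look like this:

```python
import tensorflow as tf2
import tensorflow.compat.v1 as tf1

SEED = 42

# TF2-style global seed (affects eager ops and Keras initializers).
tf2.random.set_seed(SEED)

# TF1-style graph-level seed; set it before any ops are added to the graph.
tf1.set_random_seed(SEED)
```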

Yes, they are both implemented in tf. The main difference is that NCF uses additional dense layers (see the rough sketch after the quote below). I've asked ChatGPT, and here is what I got:

> In TensorFlow 1.x, setting the random seed for GPU operations requires additional steps compared to setting the random seed for CPU operations. This is because GPU operations involve additional sources...
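Regarding the architectural difference mentioned above, here is a rough, hypothetical Keras-style sketch (layer sizes and names are made up; this is not LibRecommender's actual implementation) contrasting a plain dot-product score with NCF's extra dense (MLP) layers:

```python
import tensorflow as tf

n_users, n_items, dim = 1000, 500, 16

user_ids = tf.keras.Input(shape=(), dtype="int32")
item_ids = tf.keras.Input(shape=(), dtype="int32")

user_emb = tf.keras.layers.Embedding(n_users, dim)(user_ids)
item_emb = tf.keras.layers.Embedding(n_items, dim)(item_ids)

# A plain matrix-factorization style model scores with a dot product only.
mf_score = tf.reduce_sum(user_emb * item_emb, axis=1, keepdims=True)

# NCF additionally feeds the concatenated embeddings through dense (MLP) layers.
x = tf.keras.layers.Concatenate()([user_emb, item_emb])
x = tf.keras.layers.Dense(64, activation="relu")(x)
x = tf.keras.layers.Dense(32, activation="relu")(x)
ncf_score = tf.keras.layers.Dense(1)(x)

model = tf.keras.Model([user_ids, item_ids], ncf_score)
```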

Yeah, looks like there is nothing we can do about this problem :)

Hi, I think nothing special needs to be taken care of. Just follow the instructions in the library documentation.

@gms101 What is your overall data size? All the data will be loaded into memory before training, so TB-size data will be a concern.
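For a rough check of whether a dataset will fit, you can look at its in-memory footprint with pandas (the DataFrame below is just a placeholder for your own data):

```python
import pandas as pd

# Hypothetical interaction data; replace with your own DataFrame.
df = pd.DataFrame({
    "user": [1, 2, 3],
    "item": [10, 20, 30],
    "label": [1.0, 0.0, 1.0],
})

# deep=True also counts the memory held by object (string) columns.
size_bytes = df.memory_usage(deep=True).sum()
print(f"~{size_bytes / 1024 ** 3:.6f} GB in memory")
```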