icedpanda
I ran with the parameters from the original paper (`batch size`: 32, recommender `epochs`: 30) but still could not reproduce similar results. **Recommendation**
```
2022-06-01 11:15:52.359 |...
```
Modified `gen_evaluate` due to #42. The results I got so far: **Recommendation**
```
2022-06-07 20:04:28.364 | INFO | crslab.system.kgsf:train_recommender:147 - [Test]
2022-06-07 20:04:28.512 | INFO | crslab.data.dataloader.base:get_data:54 - [Finish dataset...
```
Hi @Oran-Ac, following the issue here, I modified `gen_evaluate` by adding a simple split function. This gives me the correct tokens instead of characters.
```python
def gen_evaluate(self, hyp,...
```
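This is not the repo's exact patch (the snippet above is truncated), but the idea can be sketched as a standalone helper with hypothetical names: split on whitespace before computing n-gram metrics, so each evaluation unit is a token rather than a character.

```python
def tokenize_for_eval(hyp, refs):
    """Split hypothesis and references on whitespace so downstream
    metrics (BLEU, distinct-n, ...) see tokens, not characters.
    Illustrative sketch, not the actual crslab gen_evaluate."""
    hyp_tokens = hyp.split()
    ref_tokens = [ref.split() for ref in refs]
    return hyp_tokens, ref_tokens

# Without the split, iterating a string yields single characters:
hyp = "i like action movies"
assert list(hyp)[0] == "i" and list(hyp)[1] == " "   # characters
assert tokenize_for_eval(hyp, [])[0][0] == "i"       # whole token "i"
```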
Same here, and my wandb dashboard doesn't even have the `gradients` section, although all metric logging works.
```python
self.logger.watch(self.rec_model.rec, log="gradients", log_graph=True, log_freq=10)
rec_trainer = Trainer(
    max_epochs=self.rec_epochs,
    logger=[self.logger],
    check_val_every_n_epoch=1,
    val_check_interval=1.0,...
```
Hi @kptkin, my environment:
- wandb: 0.13.2
- lightning: 1.7.3
- torch: 1.10
- python: 3.9
@kptkin `wandb.watch` works now after a clean reinstall
Hi @nate-wandb, these warnings appeared before training as well. However, I have printed the parameters before training and before testing, and they are correct. I tried to set a single...
Hi @nate-wandb, I found the warning was triggered by `pl.save_hyperparameters()` because I have a nested dictionary. `pl` saves it as a flat dictionary, which conflicts with some...
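For illustration only (this is not Lightning's internal code), here is a toy flattener showing how a nested hyperparameter dict becomes single-level keys, which is where name clashes with existing top-level keys can come from:

```python
def flatten(d, parent_key="", sep="/"):
    """Flatten a nested dict into single-level keys, the way some
    loggers serialize hyperparameters. Illustrative sketch only."""
    items = {}
    for k, v in d.items():
        key = f"{parent_key}{sep}{k}" if parent_key else k
        if isinstance(v, dict):
            items.update(flatten(v, key, sep=sep))
        else:
            items[key] = v
    return items

hparams = {"lr": 0.001, "model": {"lr": 0.01, "layers": 2}}
flat = flatten(hparams)
# Both learning rates survive here only because the separator
# disambiguates them; a flattener that dropped the prefix would clash.
assert flat == {"lr": 0.001, "model/lr": 0.01, "model/layers": 2}
```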
NC6 is now available after a support ticket. It's weird that they need to put your subscription on a whitelist even though you have available NC vCores.
Hi @rusty1s, should I set `add_self_loops` to `False` in modules like `GATConv` if I have already added self-loops in the dataset?
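A toy sketch of the concern behind the question (plain Python over a hypothetical edge list, not PyG's implementation): if self-loops are appended naively on top of a dataset that already contains them, without any deduplication, nodes end up with duplicate self-edges.

```python
def append_self_loops(edges, num_nodes):
    """Append (i, i) for every node. Illustrative only: this does NOT
    deduplicate, unlike layers that remove existing self-loops first."""
    return edges + [(i, i) for i in range(num_nodes)]

edges = [(0, 1), (1, 1)]  # the dataset already has a self-loop on node 1
doubled = append_self_loops(edges, num_nodes=2)
assert doubled.count((1, 1)) == 2  # node 1 now carries a duplicate self-loop
```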