Muhan Zhang


You can use `loop_dataset()`: https://github.com/muhanzhang/pytorch_DGCNN/blob/50f504131ca66382f3f078ae98e59aa3ae35b795/main.py#L223
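For reference, here is a minimal sketch of calling it for evaluation from inside main.py; the `test_graphs`/`classifier` variables and the exact return value of `loop_dataset()` are assumptions based on that file, so double-check them against the actual code:

```python
# Evaluation-only pass: no optimizer is passed, so loop_dataset computes metrics
# without doing any gradient updates (names assumed from main.py).
classifier.eval()
test_idxes = list(range(len(test_graphs)))
test_metrics = loop_dataset(test_graphs, classifier, test_idxes)
print('test metrics:', test_metrics)
```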

https://github.com/muhanzhang/pytorch_DGCNN/blob/50f504131ca66382f3f078ae98e59aa3ae35b795/main.py#L151 — `pred` holds the predictions.

https://github.com/muhanzhang/pytorch_DGCNN/blob/50f504131ca66382f3f078ae98e59aa3ae35b795/main.py#L154 — the logits are computed here.
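In other words, `pred` is just the argmax over the per-class logits. A small self-contained sketch of that relationship (the tensor shapes are made up for illustration):

```python
import torch
import torch.nn.functional as F

# Hypothetical log-softmax scores for 3 graphs over 2 classes.
logits = F.log_softmax(torch.randn(3, 2), dim=1)
pred = logits.max(1, keepdim=True)[1]  # predicted class index = argmax of the logits
print(pred)
```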

I use the Linux terminal. After saving the model, you need to rerun the script. In the script, you need to write an if/else to skip the training part...
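A hedged sketch of what that if/else could look like inside main.py; the `skip_training` flag and the model path are things you would add yourself, not existing options of the script:

```python
import torch

model_path = 'saved_model/test2.bin'   # hypothetical path; use whatever you passed to torch.save()
skip_training = True                   # e.g. a flag you add to cmd_args yourself

if skip_training:
    # Second run: reload the saved weights and go straight to testing.
    classifier.load_state_dict(torch.load(model_path))
else:
    # First run: the existing training loop from main.py, then save the model.
    for epoch in range(cmd_args.num_epochs):
        ...  # train one epoch as main.py already does
    torch.save(classifier.state_dict(), model_path)

classifier.eval()
# ...run the test part of main.py as usual...
```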

@HUI2021 You can uncomment this [line](https://github.com/muhanzhang/pytorch_DGCNN/blob/50f504131ca66382f3f078ae98e59aa3ae35b795/main.py#L178) to save the test graphs' raw logit scores (for binary classification). Take the exponential of the logits to get the predicted probabilities for being class...
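For example, once the raw scores are saved, converting them to class-1 probabilities could look like the sketch below; the file name and its row layout are assumptions, so adapt them to whatever that line actually writes out:

```python
import numpy as np

# Assume each row holds the two log-softmax scores of one test graph (binary classification).
logits = np.loadtxt('test_scores.txt')   # hypothetical file name
probs = np.exp(logits)                   # exp(log_softmax) = softmax probabilities
print(probs[:, 1])                       # probability of each test graph being class 1
```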

@HUI2021 Check this [line](https://github.com/muhanzhang/pytorch_DGCNN/blob/50f504131ca66382f3f078ae98e59aa3ae35b795/mlp_dropout.py#L63). The logits are computed by log_softmax, so taking the exponential of them directly recovers the softmax, which is a probability distribution.
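A quick self-contained check of that identity:

```python
import torch
import torch.nn.functional as F

x = torch.randn(4, 2)                        # raw scores for 4 graphs, 2 classes
log_probs = F.log_softmax(x, dim=1)          # what the classifier returns as "logits"
probs = torch.exp(log_probs)                 # exponential recovers the softmax
print(torch.allclose(probs, F.softmax(x, dim=1)))  # True
print(probs.sum(dim=1))                      # each row sums to 1
```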

Didn't check it completely, but it seems basically correct. You don't need the following lines:

    with open('saved_model/test2/archive/data.pkl', 'rb') as parameters:
        saved_cmd_args = pickle.load(parameters)
    for key, value in vars(saved_cmd_args).items():
        vars(cmd_args)[key] = value

...

The label cannot be missing. You can use a dummy label 0 for all test graphs. For your case, use `./run_DGCNN.sh my_train_data 1 0` first to train and save the...

Then a workaround would be appending a dummy test graph to `my_train_data` and using `./run_DGCNN.sh my_train_data 1 1` to save the model. @andreitam11
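Putting the two steps together, the workflow would roughly be the following; my reading of the run_DGCNN.sh arguments as dataset / fold / number-of-test-graphs is an assumption, so double-check it against the README:

```bash
# Train on all graphs in my_train_data (no held-out test graphs) and save the model.
./run_DGCNN.sh my_train_data 1 0

# Workaround when a test graph is required: append one dummy-labeled graph to
# my_train_data, then use that single last graph as the test set.
./run_DGCNN.sh my_train_data 1 1
```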

@andreitam11 Did you modify the saved file name in `torch.save(classifier.state_dict(), 'saved_model/test2.bin')` and `classifier.load_state_dict(torch.load(model_name))` when you retrain and reload the model?
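For reference, the two paths simply have to match; a minimal sketch of the pair of calls (assuming `classifier` is the model from main.py, and the file name is whatever you chose):

```python
import torch

model_name = 'saved_model/test2.bin'   # keep this identical in both calls

# After training finishes:
torch.save(classifier.state_dict(), model_name)

# In the later run that reloads the model:
classifier.load_state_dict(torch.load(model_name))
```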