Xinzhou Jin

3 issues opened by Xinzhou Jin

In your implementation settings, e.g. for Cora, the hidden dim is 128, but in your code you double it to `2 * out_channels`. Is this reasonable? Apparently the current dimension...
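A minimal sketch of the mismatch the issue points at. The names `out_channels` and `hidden_size` are assumptions for illustration, not the repo's actual identifiers; the point is only that sizing the hidden layer as `2 * out_channels` yields twice the width the paper's setting (128) states:

```python
# Hypothetical reconstruction of the dimension question in the issue.
# `out_channels` / `hidden_size` are assumed names, not the repo's code.
def hidden_size(out_channels: int, doubled: bool = True) -> int:
    """Width of the hidden layer: the stated Cora setting is 128,
    while code that allocates 2 * out_channels gives twice that."""
    return 2 * out_channels if doubled else out_channels

paper_hidden = hidden_size(128, doubled=False)  # 128, as in the paper
code_hidden = hidden_size(128)                  # 256, as in the code
```

One common reason for the doubling in graph autoencoder code (an assumption here, not confirmed from the repo) is that a variational encoder emits both a mean and a log-variance per output channel, so its hidden or final layer is sized `2 * out_channels`.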

Hello! Is there a problem with the second-order interaction part of the FM? It looks like the code only performs second-order interactions on the sparse features; I don't see any interaction applied to the dense features.
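For reference, the standard FM pairwise-interaction term makes no distinction between feature types: whatever features are passed in get crossed. The sketch below is a generic pure-Python version of that term (not the repo's code); concatenating the sparse and dense features before calling it would give interactions across both:

```python
def fm_second_order(x, v):
    """Standard FM second-order term:
        0.5 * sum_f [ (sum_i v[i][f] * x[i])^2 - sum_i (v[i][f] * x[i])^2 ]
    x: feature values (length n), v: factor matrix (n rows, k columns).
    All features in x interact pairwise -- passing sparse and dense
    features together crosses both."""
    k = len(v[0])
    total = 0.0
    for f in range(k):
        s = sum(v[i][f] * x[i] for i in range(len(x)))       # (sum of v*x)^2 part
        sq = sum((v[i][f] * x[i]) ** 2 for i in range(len(x)))  # sum of squares part
        total += 0.5 * (s * s - sq)
    return total
```

With two features and one-dimensional factors `v = [[1.0], [1.0]]`, `x = [1.0, 2.0]`, the term reduces to the single pairwise product `x[0] * x[1] = 2.0`.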

**Error message**:

```
File "./negation-learning/transformers/examples/lm_training/finetune_on_pregenerated_negation_distributed.py", line 22, in <module>
    from transformers.modeling_bert import BertForPreTraining, BertConfig, BertForNegPreTraining, BertForNegSameBatch
ImportError: cannot import name 'BertForNegPreTraining' from 'transformers.modeling_bert'
```

**And I also did not find such a model...
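`BertForNegPreTraining` and `BertForNegSameBatch` are not classes in the upstream `transformers` library, so this error is expected unless the repo's forked copy of `transformers` is the one on the path. A small generic helper (an illustration, not part of the repo) can check which names a given installed module actually defines before launching the script:

```python
import importlib

def missing_names(module_name, names):
    """Import `module_name` and return the subset of `names` it does
    not define. Useful to check whether the installed transformers is
    the fork that adds the custom classes, or the upstream package."""
    mod = importlib.import_module(module_name)
    return [n for n in names if not hasattr(mod, n)]

# Example with a stdlib module; with transformers installed one would
# call missing_names("transformers.modeling_bert",
#                    ["BertForPreTraining", "BertForNegPreTraining"]).
print(missing_names("math", ["sqrt", "no_such_function"]))
```

If the custom names come back missing, a likely fix (an assumption about this repo's layout) is to install or put on `sys.path` the modified `transformers` directory shipped inside the repository rather than the pip-installed package.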