
Modeling Sentiment Dependencies with Graph Convolutional Networks for Aspect-level Sentiment Classification

12 SDGCN issues

The error is as follows:

```
Traceback (most recent call last):
  File "D:/project/SDGCN-master/run_glove.py", line 345, in
    Train, Test, word_embedding = preprocess()
  File "D:/project/SDGCN-master/run_glove.py", line 101, in preprocess
    word_id_mapping, w2v = data_helpers.load_w2v(FLAGS.embedding_file_path, FLAGS.word_embedding_dim)
  File "D:\project\SDGCN-master\data_helpers.py", line...
```

```
File "D:/pythonProject/SDGCN-master/run_BERT.py", line 376, in
    train_acc, dev_acc, max_test_acc, max_test_F1_macro, max_test_step, train_all_softmax, test_all_softmax = train(Train, Test, word_embedding)
File "D:/pythonProject/SDGCN-master/run_BERT.py", line 199, in train
    sequence_length=Train['x'].shape[1],
IndexError: tuple index out of range
```

How can I fix this error? The file 'data/data_res/Restaurants_glove.42B.300d.txt' was missing, so I put an empty one in its place; I'm not sure whether that is the cause.

Hi, I encountered this error when running create_bert_embeddings: `AttributeError: module 'numpy' has no attribute 'gcd'`. Any ideas, please? Thanks.
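`np.gcd` was introduced in NumPy 1.15, so this error usually means the installed NumPy predates it; upgrading NumPy should resolve it. A minimal sketch of a workaround for older installs (the `safe_gcd` helper is hypothetical, not part of the repo):

```python
import numpy as np
from math import gcd


def safe_gcd(a, b):
    # Prefer np.gcd when available (NumPy >= 1.15); fall back to the
    # standard-library math.gcd on older NumPy versions.
    return np.gcd(a, b) if hasattr(np, "gcd") else gcd(a, b)
```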

你好,请问一下您论文fig2中,第一个gcn中的节点代表什么呢

`ValueError: Cannot create a tensor proto whose content is larger than 2GB.`
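This error typically appears when a large embedding matrix is baked into the TensorFlow graph as a constant, which hits the 2GB protobuf limit. A common workaround (a sketch, not the repo's actual code; `build_embedding_var` is a hypothetical helper) is to feed the matrix through a placeholder at initialization time instead:

```python
import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()


def build_embedding_var(embedding_matrix):
    # Create a placeholder and initialize the variable from it via
    # feed_dict, so the matrix never becomes a graph constant.
    ph = tf.placeholder(tf.float32, shape=embedding_matrix.shape)
    var = tf.Variable(ph, trainable=False, name="word_embedding")
    return ph, var


# Usage: sess.run(var.initializer, feed_dict={ph: embedding_matrix})
```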

How do I fix this error at runtime?

```
SDGCN-master\data_helpers.py in load_w2v(w2v_file, embedding_dim, is_skip)
    127     w2v.append([0.] * embedding_dim)
    128     cnt = 0
--> 129     for line in fp:
    130         cnt += 1
    131         line = line.split()
UnicodeDecodeError: 'gbk'...
```

The create_bert_embeddings script does not produce the **_targets_embedding_file.