SDGCN
Modeling Sentiment Dependencies with Graph Convolutional Networks for Aspect-level Sentiment Classification
I'd like to get the experiments running on my end; I'd appreciate your help, thanks!
UnicodeDecodeError: 'gbk' codec can't decode byte 0xac in position 1724: illegal multibyte sequence
The error is as follows:

```
Traceback (most recent call last):
  File "D:/project/SDGCN-master/run_glove.py", line 345, in <module>
    Train, Test, word_embedding = preprocess()
  File "D:/project/SDGCN-master/run_glove.py", line 101, in preprocess
    word_id_mapping, w2v = data_helpers.load_w2v(FLAGS.embedding_file_path, FLAGS.word_embedding_dim)
  File "D:\project\SDGCN-master\data_helpers.py", line...
```
`File "D:/pythonProject/SDGCN-master/run_BERT.py", line 376, in train_acc, dev_acc, max_test_acc,max_test_F1_macro,max_test_step, train_all_softmax, test_all_softmax = train(Train, Test, word_embedding) File "D:/pythonProject/SDGCN-master/run_BERT.py", line 199, in train sequence_length=Train['x'].shape[1], IndexError: tuple index out of range` 这个错误应该怎么解决,'data/data_res/Restaurants_glove.42B.300d.txt'这个文件没有,自己放了个空的,不知道是不是这个的原因
Hi, I encountered this error when running create_bert_embeddings: AttributeError: module 'numpy' has no attribute 'gcd'. Any ideas please? Thanks.
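`np.gcd` was only added in NumPy 1.15, so this error usually means an older NumPy is installed; upgrading (`pip install --upgrade numpy`) is the simplest fix. If the environment is pinned, a small compatibility shim along these lines should also work (`gcd_compat` is a made-up helper name, not part of the repo):

```python
import math
import numpy as np

def gcd_compat(a, b):
    """Elementwise gcd that also works on NumPy < 1.15, where np.gcd is missing."""
    if hasattr(np, "gcd"):
        return np.gcd(a, b)                      # NumPy >= 1.15
    # Vectorize Python's math.gcd as a fallback for older NumPy.
    return np.frompyfunc(math.gcd, 2, 1)(a, b).astype(np.int64)

print(gcd_compat(np.array([12, 18]), np.array([8, 27])))  # [4 9]
```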
Hi, may I ask: in Fig. 2 of your paper, what do the nodes in the first GCN represent?
"Cannot create a tensor proto whose content is larger than 2GB.") ValueError: Cannot create a tensor proto whose content is larger than 2GB.
I have downloaded glove.42B.300d.zip.
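This is TensorFlow's 2 GB GraphDef limit: the glove.42B vocabulary is large enough that baking the whole embedding matrix into the graph as a constant (or variable initializer) overflows the protobuf. A common workaround, sketched here against the TF 1.x API this repo uses (the file name and variable names are illustrative), is to initialize the variable with a cheap fill op and feed the real matrix in at runtime:

```python
import numpy as np
import tensorflow as tf  # TF 1.x API, matching this repo

w2v = np.load("w2v.npy")        # the pre-built embedding matrix (assumed file)
vocab_size, dim = w2v.shape

# Don't do tf.constant(w2v) / tf.Variable(w2v): that serializes the whole
# matrix into the GraphDef, which is capped at 2 GB. Instead, initialize
# with a cheap fill op and copy the data in through a placeholder.
embedding_ph = tf.placeholder(tf.float32, shape=[vocab_size, dim])
word_embedding = tf.Variable(tf.zeros([vocab_size, dim]), trainable=False,
                             name="word_embedding")
assign_embedding = word_embedding.assign(embedding_ph)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # The big matrix travels via feed_dict, bypassing the protobuf limit.
    sess.run(assign_embedding, feed_dict={embedding_ph: w2v})
```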
UnicodeDecodeError: 'gbk' codec can't decode byte 0xa2 in position 1389: illegal multibyte sequence
How do I fix this error that appears at runtime?

```
SDGCN-master\data_helpers.py in load_w2v(w2v_file, embedding_dim, is_skip)
    127     w2v.append([0.] * embedding_dim)
    128     cnt = 0
--> 129     for line in fp:
    130         cnt += 1
    131         line = line.split()

UnicodeDecodeError: 'gbk'...
```
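All of the 'gbk' UnicodeDecodeError reports above share one root cause: on Chinese-locale Windows, `open()` defaults to the GBK codec, while GloVe/embedding files are UTF-8. A minimal sketch of the likely fix in data_helpers.load_w2v (the parsing details are assumptions based on the traceback, not the repo's exact code):

```python
def load_w2v(w2v_file, embedding_dim):
    """Load GloVe-style vectors; sketch of the encoding fix."""
    w2v = [[0.] * embedding_dim]  # index 0 reserved for padding, as in the traceback
    word_id_mapping = {}
    # encoding="utf-8" is the actual fix: GloVe files are UTF-8, but open()
    # on Chinese-locale Windows defaults to GBK and raises UnicodeDecodeError.
    with open(w2v_file, encoding="utf-8", errors="ignore") as fp:
        for idx, line in enumerate(fp, start=1):
            tokens = line.split()
            if len(tokens) != embedding_dim + 1:
                continue  # skip malformed lines
            word_id_mapping[tokens[0]] = idx
            w2v.append([float(v) for v in tokens[1:]])
    return word_id_mapping, w2v
```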
The create_bert_embeddings script does not produce the **_targets_embedding_file.