Global-Encoding
Some trouble when running your code
Traceback (most recent call last):
  File "train.py", line 332, in <module>
Hi, I'd like to ask what format the data should be in. I don't quite understand this sentence from the author: "Remember to put the data into a folder and name them train.src, train.tgt, valid.src, valid.tgt, test.src and test.tgt, and make a new folder inside called data." Do I just rename the .txt files to .src and .tgt?
Yes. Or you can look at preprocess.py line 27 and line 29 and add the suffix arguments there.
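To make the expected layout concrete, here is a small sketch that checks a folder for the six files the author lists above (the helper name is my own, not part of the repo):

```python
from pathlib import Path

# The six files the data folder must contain, per the author's instruction.
EXPECTED = ["train.src", "train.tgt", "valid.src", "valid.tgt", "test.src", "test.tgt"]

def missing_files(folder):
    """Return the expected data files that are not present in `folder`."""
    folder = Path(folder)
    return [name for name in EXPECTED if not (folder / name).is_file()]
```

So a source/target .txt pair would simply be renamed to, e.g., train.src and train.tgt before running preprocess.py.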
Haha, this problem was solved earlier: just run chmod 777 ROUGE-1.5.5.pl.
Then could you send your processed saved_data to my email? [email protected] Following the author's preprocessing steps, I get this error during training: RuntimeError: Length of all samples has to be greater than 0, but found an element in 'lengths' that is <= 0. Also, does the model no longer need a separate loss? Earlier versions had one, but after the latest update it is gone.
Right, in the new version the loss is written into the model, so there is no separate loss to call. You can also check whether your data contains empty lines.
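A quick sketch for the empty-line check suggested above (the function name is mine; adjust the encoding to your data):

```python
def empty_line_numbers(path):
    """Return 1-based indices of empty or whitespace-only lines in a text file."""
    with open(path, encoding="utf-8") as f:
        return [i for i, line in enumerate(f, 1) if not line.strip()]
```

Run it over train.src, train.tgt, etc.; any hit yields a zero-length sample, which is exactly what the RuntimeError in the question above complains about.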
Traceback (most recent call last):
  File "train.py", line 322, in <module>
    main()
  File "train.py", line 314, in main
    train_model(model, data, optim, i, params)
  File "train.py", line 161, in train_model
    raise e
  File "train.py", line 141, in train_model
    loss, outputs = model(src, lengths, dec, targets)
  File "/home/anaconda3/lib/python3.7/site-packages/torch/nn/modules/module.py", line 489, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/下载/Global-Encoding-master/models/seq2seq.py", line 40, in forward
    contexts, state = self.encoder(src, src_len.tolist())
  File "/home/anaconda3/lib/python3.7/site-packages/torch/nn/modules/module.py", line 489, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/下载/Global-Encoding-master/models/rnn.py", line 52, in forward
    embs = pack(self.embedding(inputs), lengths)
  File "/home/anaconda3/lib/python3.7/site-packages/torch/nn/utils/rnn.py", line 148, in pack_padded_sequence
    return PackedSequence(torch._C._VariableFunctions._pack_padded_sequence(input, lengths, batch_first))
RuntimeError: Length of all samples has to be greater than 0, but found an element in 'lengths' that is <= 0
I get this error during training. Do I need to change the mini-batch size?
Could there be empty lines in your data?
Traceback (most recent call last):
  File "train.py", line 322, in <module>
    main()
  File "train.py", line 314, in main
    train_model(model, data, optim, i, params)
  File "train.py", line 119, in train_model
    for src, tgt, src_len, tgt_len, original_src, original_tgt in trainloader:
  File "/home/mu/.conda/envs/dym/lib/python3.5/site-packages/torch/utils/data/dataloader.py", line 314, in __next__
    batch = self.collate_fn([self.dataset[i] for i in indices])
  File "/home/mu/global-encoding/utils/data_helper.py", line 84, in padding
    src_pad = torch.zeros(len(src), max(src_len)).long()
RuntimeError: sizes must be non-negative
I get this error during training. What does it mean?
It might also be the empty-line problem.