alphanlp
self.logits = tf.layers.dense(self.dec, len(en2idx))
TypeError: <lambda>() got an unexpected keyword argument 'is_last'
UserWarning: Output "lambda_83" missing from loss dictionary. We assume this was done on purpose, and we will not be expecting any data to be passed to "lambda_83" during training.
self.model.compile(optimizer,...
Change
self.target_layer = TimeDistributed(Dense(o_tokens.num(), use_bias=False))
to:
self.target_layer = TimeDistributed(Dense(o_tokens.num(), activation='softmax', use_bias=False))
What are benebot_vector and config? They are not in the source code, and there is no corresponding module.
Why is the input_size of the Linear layer 2?
torch.nn.CrossEntropyLoss(weight=None, size_average=None, ignore_index=-100, reduce=None, reduction='mean') — this criterion combines nn.LogSoftmax() and nn.NLLLoss() in one single class. So I guess applying F.softmax(logits) before CrossEntropyLoss() is wrong?
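That guess is right: CrossEntropyLoss already applies log-softmax internally, so feeding it softmax-ed probabilities normalizes twice and silently changes the loss. A minimal NumPy sketch of what the criterion computes (the logits and targets here are made-up values for illustration, not from the model above):

```python
import numpy as np

def log_softmax(x):
    # numerically stable log-softmax along the last axis
    x = x - x.max(axis=-1, keepdims=True)
    return x - np.log(np.exp(x).sum(axis=-1, keepdims=True))

def cross_entropy(logits, target):
    # what nn.CrossEntropyLoss does: log-softmax + mean negative log likelihood
    return -log_softmax(logits)[np.arange(len(target)), target].mean()

logits = np.array([[2.0, 0.5, -1.0],
                   [0.1, 1.5,  0.3]])
targets = np.array([0, 1])

# correct: pass raw logits straight to the criterion
correct = cross_entropy(logits, targets)

# wrong: apply softmax first, then the criterion log-softmaxes again
probs = np.exp(log_softmax(logits))  # softmax via exp(log_softmax)
double = cross_entropy(probs, targets)

print(correct, double)  # the two values differ: softmax-then-CE is a different loss
```

So the fix is to remove the F.softmax call and hand raw logits to CrossEntropyLoss; use softmax only at inference time when actual probabilities are needed.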