W2NER
A few beginner questions about removing the CNN layer, hoping for some guidance
Hi, I want to remove the entire CNN layer, i.e. feed the outputs of the three embedding layers directly into the co-predictor module, but I've run into some problems while making the change that I can't work out, so I'm hoping someone can help. I commented out the conv layer and the channel count fed into it:
conv_input_size = config.lstm_hid_size + config.dist_emb_size + config.type_emb_size
self.convLayer = ConvolutionLayer(conv_input_size, config.conv_hid_size, config.dilation, config.conv_dropout)
self.dropout = nn.Dropout(config.emb_dropout)
and changed this part of forward:
conv_inputs = torch.cat([dis_emb, reg_emb, cln], dim=-1)
conv_inputs = torch.masked_fill(conv_inputs, grid_mask2d.eq(0).unsqueeze(-1), 0.0)
conv_outputs = self.convLayer(conv_inputs)
conv_outputs = torch.masked_fill(conv_outputs, grid_mask2d.eq(0).unsqueeze(-1), 0.0)
outputs = self.predictor(word_reps, word_reps, conv_outputs)
to:
conv_inputs = torch.cat([dis_emb, reg_emb, cln], dim=-1)
conv_outputs = torch.masked_fill(conv_inputs, grid_mask2d.eq(0).unsqueeze(-1), 0.0)
outputs = self.predictor(word_reps, word_reps, conv_outputs)
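With that change, the tensor reaching the predictor is the raw concatenation, so its last dimension is just the sum of the three embedding widths. A throwaway shape check as a sketch (the assert line is my own addition, not repo code, assuming the config sizes are reachable inside forward):

# dis_emb ends in dist_emb_size, reg_emb in type_emb_size, cln in lstm_hid_size,
# so the concatenation is exactly conv_input_size wide.
assert conv_outputs.size(-1) == config.dist_emb_size + config.type_emb_size + config.lstm_hid_size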
Then the code below raises an error saying channels is undefined, and I haven't been able to fix it yet. With my change, what should config.conv_hid_size * len(config.dilation) be replaced with? Something like dis_embs + lstm_hid_size + reg_embs?
self.predictor = CoPredictor(config.label_num, config.lstm_hid_size, config.biaffine_size,
                             config.conv_hid_size * len(config.dilation), config.ffnn_hid_size,
                             config.out_dropout)
class CoPredictor(nn.Module):
    def __init__(self, cls_num, hid_size, biaffine_size, channels, ffnn_hid_size, dropout=0):
        super().__init__()
        # Biaffine branch: two projections of the word representations.
        self.mlp1 = MLP(n_in=hid_size, n_out=biaffine_size, dropout=dropout)
        self.mlp2 = MLP(n_in=hid_size, n_out=biaffine_size, dropout=dropout)
        self.biaffine = Biaffine(n_in=biaffine_size, n_out=cls_num, bias_x=True, bias_y=True)
        # Grid branch: `channels` is the input width of mlp_rel, i.e. the last
        # dimension of the grid tensor passed to the predictor as its third argument.
        self.mlp_rel = MLP(channels, ffnn_hid_size, dropout=dropout)
        self.linear = nn.Linear(ffnn_hid_size, cls_num)
        self.dropout = nn.Dropout(dropout)
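For context on why channels must match: it only controls the input size of mlp_rel, which is applied to the grid tensor. A hedged sketch of what the forward pass presumably looks like, reconstructed from the constructor above rather than copied from the repo:

def forward(self, x, y, z):
    # Biaffine branch over the two word-representation inputs.
    h = self.dropout(self.mlp1(x))
    t = self.dropout(self.mlp2(y))
    o1 = self.biaffine(h, t)
    # Grid branch: z's last dim must equal `channels`, or mlp_rel fails.
    o2 = self.linear(self.dropout(self.mlp_rel(z)))
    return o1 + o2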
Just replace config.conv_hid_size * len(config.dilation) with conv_input_size and it will work.
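Concretely, a sketch of the fixed call (conv_input_size is the existing line from __init__, which should stay uncommented, since it is exactly the width of the concatenated tensor now handed to the predictor):

conv_input_size = config.lstm_hid_size + config.dist_emb_size + config.type_emb_size
self.predictor = CoPredictor(config.label_num, config.lstm_hid_size, config.biaffine_size,
                             conv_input_size, config.ffnn_hid_size,
                             config.out_dropout)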
Great, thanks!