NLP-Project
Question about the tensor shapes of the data flowing through the BiLSTM
states, hidden = self.encoder(embeddings.permute([1, 0, 2]))  # [75, 64, 50], i.e. [seq_len, batch, embed_dim]
encoding = torch.cat([states[0], states[-1]], dim=1)  # tensor concatenation, [32, 512]

I don't understand this part. Would it be convenient to add you on WeChat (vx) so I can ask in detail? I'm willing to pay.
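For reference, here is a minimal sketch of how the shapes work out, assuming the encoder is a single-layer bidirectional nn.LSTM with embed_dim=50 and hidden_size=128 (hypothetical values chosen so the concatenated encoding has 512 features; the actual hyperparameters in the repo may differ):

import torch
import torch.nn as nn

batch_size, seq_len, embed_dim, hidden_size = 64, 75, 50, 128
encoder = nn.LSTM(input_size=embed_dim, hidden_size=hidden_size,
                  bidirectional=True)  # batch_first=False by default

embeddings = torch.randn(batch_size, seq_len, embed_dim)  # [64, 75, 50]
x = embeddings.permute([1, 0, 2])                         # [75, 64, 50] = [seq_len, batch, embed_dim]

states, hidden = encoder(x)
# states holds the per-timestep outputs: [seq_len, batch, 2 * hidden_size] = [75, 64, 256]
# (forward and backward directions are already concatenated on the last dimension)

encoding = torch.cat([states[0], states[-1]], dim=1)
# states[0]  -> output at the first timestep, [batch, 256]
# states[-1] -> output at the last timestep,  [batch, 256]
# encoding   -> their concatenation,          [batch, 512]
print(encoding.shape)  # torch.Size([64, 512])

In other words, states[0] and states[-1] are the first and last timestep outputs of the bidirectional LSTM; concatenating them along dim=1 produces one fixed-size vector per example, which is why the second dimension becomes 512 (4 × hidden_size).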