SKNet
About "attention_vectors = torch.cat([attention_vectors, vector], dim=1)"
In your code there are feas = torch.cat([feas, fea], dim=1) and attention_vectors = torch.cat([attention_vectors, vector], dim=1). However, in the original paper the fused feature is V = a*U1 + b*U2, with the attention weights normalized across the two branches. The "dim" in torch.cat([attention_vectors, vector]) is 1, but I think it should be 0. Am I right?
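For context, here is a minimal shape trace of the pattern in question. This is only a sketch, assuming the common SKConv layout where each per-branch vector carries a batch dimension and is unsqueezed at dim=1 before the cat; whether dim=1 or dim=0 indexes the branches depends entirely on that assumption, so the tensor names here are illustrative, not the repo's exact code:

```python
import torch

batch, M, channels = 4, 2, 64  # M = number of branch kernels

# Per-branch attention vectors, each shaped [batch, 1, channels]
# after an unsqueeze at dim=1 (assumption: mirrors the repo's loop).
vector_a = torch.randn(batch, channels).unsqueeze(1)
vector_b = torch.randn(batch, channels).unsqueeze(1)

# With a batch axis at dim=0, concatenating along dim=1 stacks the
# branches; without a batch axis, dim=0 would be the branch axis.
attention_vectors = torch.cat([vector_a, vector_b], dim=1)
print(attention_vectors.shape)  # torch.Size([4, 2, 64]) -> [batch, M, channels]
```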
I share the same doubt.
It seems that Softmax(dim=1) should be Softmax(dim=0).
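One quick way to check which dim is right: the paper requires the branch weights to sum to 1 (a + b = 1 per channel), so the softmax must normalize over whichever axis indexes the branches. A hedged sketch, assuming attention_vectors is shaped [batch, M, channels] as above:

```python
import torch

attention_vectors = torch.randn(4, 2, 64)  # assumed [batch, M=2, channels]

# Softmax over the branch axis: for every (sample, channel) pair,
# the two branch weights a and b sum to 1, matching V = a*U1 + b*U2.
weights = torch.softmax(attention_vectors, dim=1)
print(weights.sum(dim=1)[0, :5])  # tensor([1., 1., 1., 1., 1.])

# Softmax over dim=0 would instead normalize across samples in the
# batch, which is not what the paper's constraint describes. Under a
# different layout without a batch axis, dim=0 would be the branch axis.
```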