Results: 2 issues of yeye
Hello, author. I sincerely hope you can answer this when you see it. I urgently want to understand why multi-head attention takes Q, K, and V as inputs and...
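For context on the question: Q (query), K (key), and V (value) are the three projections that scaled dot-product attention operates on; in self-attention all three are derived from the same input. A minimal NumPy sketch (illustrative only, not the repository's implementation) of single-head scaled dot-product attention:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-2, -1) / np.sqrt(d_k)  # similarity of each query to each key
    scores -= scores.max(axis=-1, keepdims=True)    # numerically stable softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # rows sum to 1
    return weights @ V                              # weighted average of values

# Self-attention: Q, K and V all come from the same input tensor.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))  # (sequence_length, d_model), sizes chosen arbitrarily
out = scaled_dot_product_attention(x, x, x)
```

Multi-head attention simply runs several such attentions in parallel on learned linear projections of Q, K, and V, then concatenates the results.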
https://github.com/benedekrozemberczki/pytorch_geometric_temporal/blob/b2833bec8a0ba3e0072969cf4505e97c25315e78/torch_geometric_temporal/nn/attention/stgcn.py#L34-L37 `X.permute(0, 3, 2, 1)` is applied at the start, and again at the end. Since applying the same permutation twice restores the original axis order, aren't the dimensions still consistent with the input shape (batch_size, input_time_steps, num_nodes, in_channels)?
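The shape observation in the question can be verified directly: the permutation (0, 3, 2, 1) swaps axes 1 and 3 and is therefore its own inverse, so applying it twice returns the original layout. A small NumPy check (`np.transpose` with an axis tuple behaves like PyTorch's `Tensor.permute`; the sizes below are arbitrary placeholders):

```python
import numpy as np

# Placeholder sizes for (batch_size, input_time_steps, num_nodes, in_channels).
x = np.arange(2 * 4 * 3 * 5).reshape(2, 4, 3, 5)

# First permute: (batch, time, nodes, channels) -> (batch, channels, nodes, time).
y = np.transpose(x, (0, 3, 2, 1))

# Second permute with the same tuple swaps axes 1 and 3 back.
z = np.transpose(y, (0, 3, 2, 1))

print(y.shape)  # intermediate layout used by the inner computation
print(z.shape)  # matches the original input shape
```

So yes, the output shape matches the input shape; the permutes only rearrange the axes temporarily so that the intermediate operations see (batch, channels, nodes, time).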