VisionMamba
Matrix dimensions do not match
I am currently facing a problem. I replaced the Transformer part of my code with the code you provided, but it reports that the dimensions in the multiplication do not match: multiplying x1_ssm and x2_ssm with z fails because their shapes differ. I hope you can clear this up for me. I am also still confused about one point: doesn't the Vim paper describe two kinds of Mamba blocks, forward and backward? Why don't I see that distinction in the code you provided?
def forward_temporal(self, x, F):
    B, J, C = x.shape
    # Skip connection
    skip = x
    # Normalization
    x = self.norm(x)
    # Project x into z1 and x1 with linear layers
    z1 = self.proj_x(x)
    x1 = self.proj_z(x)
    # Forward branch: reshape to (B, C, J) for Conv1d, then SSM
    x1 = x1.reshape(B, C, J)
    x1_rearranged = self.softplus(x1)
    forward_conv_output = self.forward_conv1d(x1_rearranged)
    forward_conv_output = forward_conv_output.reshape(B, J, C)
    x1_ssm = self.forward_ssm(forward_conv_output)
    # Backward branch (reuses x1, which is already (B, C, J) at this point)
    x2 = x1.reshape(B, C, J)
    x2_rearranged = self.softplus(x2)
    backward_conv_output = self.backward_conv1d(x2_rearranged)
    backward_conv_output = backward_conv_output.reshape(B, J, C)
    x2_ssm = self.backward_ssm(backward_conv_output)
    # Gate activation
    z = self.activation(z1)
    # Multiply the backward SSM output with z (element-wise)
    x2 = x2_ssm * z
    # Multiply the forward SSM output with z (element-wise)
    x1 = x1_ssm * z
    # Add the two gated branches
    x = x1 + x2
    # Add skip connection
    return x + skip
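For reference, here is a minimal, self-contained sketch of how I currently understand the shapes should line up in a Vim-style bidirectional block. It is not your actual implementation: the layer widths, the proj_x/proj_z split, and the stand-in forward_ssm/backward_ssm modules (plain Linear layers in place of the real selective scan) are all assumptions I made for illustration. The two things I am trying to check are (1) that z has to be projected to the same inner dimension as the SSM outputs before the element-wise gate, and (2) that the "backward" block from the Vim paper is the same pipeline run on the token sequence flipped along the joint axis and flipped back before the sum. Is this the intended structure?

import torch
import torch.nn as nn

class BidirectionalBlockSketch(nn.Module):
    # dim: input/output channels C; inner: expanded width inside the block (assumed size)
    def __init__(self, dim, inner):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.proj_x = nn.Linear(dim, inner)   # token path
        self.proj_z = nn.Linear(dim, inner)   # gate path -> same width as the SSM outputs
        self.forward_conv1d = nn.Conv1d(inner, inner, kernel_size=3, padding=1)
        self.backward_conv1d = nn.Conv1d(inner, inner, kernel_size=3, padding=1)
        # Stand-ins for the selective-scan SSMs; the real block would use Mamba's scan here.
        self.forward_ssm = nn.Linear(inner, inner)
        self.backward_ssm = nn.Linear(inner, inner)
        self.activation = nn.SiLU()
        self.out_proj = nn.Linear(inner, dim)  # bring the gated sum back to dim for the skip

    def branch(self, x, conv, ssm):
        # x: (B, J, inner); Conv1d expects (B, channels, length), so transpose instead of reshape
        y = conv(x.transpose(1, 2)).transpose(1, 2)   # (B, J, inner)
        y = self.activation(y)
        return ssm(y)                                  # (B, J, inner)

    def forward(self, x):
        # x: (B, J, C)
        skip = x
        x = self.norm(x)
        xt = self.proj_x(x)                       # (B, J, inner)
        z = self.activation(self.proj_z(x))       # (B, J, inner), same shape as the SSM outputs
        # Forward direction
        y_fwd = self.branch(xt, self.forward_conv1d, self.forward_ssm)
        # Backward direction: flip along the token/joint axis, process, flip back
        y_bwd = self.branch(torch.flip(xt, dims=[1]), self.backward_conv1d, self.backward_ssm)
        y_bwd = torch.flip(y_bwd, dims=[1])
        # Element-wise gating now matches shape for shape
        y = y_fwd * z + y_bwd * z                  # (B, J, inner)
        return self.out_proj(y) + skip             # (B, J, C)

# Quick shape check with assumed sizes:
# blk = BidirectionalBlockSketch(dim=64, inner=128)
# out = blk(torch.randn(2, 17, 64))   # -> torch.Size([2, 17, 64])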