MiDaS
swin2_tiny failed to run forward(): RuntimeError: unflatten: Provided sizes [64, 64] don't multiply up to the size of dim 2 (64) in the input tensor.
I tried with

```python
model = DPTDepthModel(
    path=None,
    backbone="swin2t16_256",
    non_negative=True,
)
```
During inference at https://github.com/isl-org/MiDaS/blob/bdc4ed64c095e026dc0a2f17cabb14d58263decb/midas/backbones/utils.py#L72 it gave the error

```
RuntimeError: unflatten: Provided sizes [64, 64] don't multiply up to the size of dim 2 (64) in the input tensor
```
The input at this layer has shape `(b, 64, 64, 96)`, where `b` is the batch size. The next operator, `pretrained.act_postprocess1`, is a

```python
Sequential(
  (0): Transpose()
  (1): Unflatten(dim=2, unflattened_size=torch.Size([64, 64]))
)
```

I don't think `Unflatten(dim=2, unflattened_size=torch.Size([64, 64]))` can work on any of the dimensions of `(b, 64, 64, 96)`: unflatten requires the target dimension's size to equal the product of the provided sizes, and 64 * 64 = 4096, which none of the dimensions match. On the other hand, it seems `(b, 64, 64, 96)` has already been unflattened.
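As a sanity check, the size constraint behind the error can be reproduced without the model itself. This is a minimal sketch (the shapes are taken from the error above; the helper function and batch size of 1 are hypothetical, not part of MiDaS): `Unflatten(dim=2, ...)` replaces dimension 2 with the given sizes, so it only succeeds if that dimension's size equals their product.

```python
def can_unflatten(shape, dim, unflattened_size):
    """Return True if `shape[dim]` can be split into `unflattened_size`,
    i.e. if the product of the provided sizes equals the size of that dim."""
    target = 1
    for s in unflattened_size:
        target *= s
    return shape[dim] == target

# The tensor entering act_postprocess1, with a hypothetical batch size of 1:
print(can_unflatten((1, 64, 64, 96), dim=2, unflattened_size=(64, 64)))   # False: dim 2 is 64, not 4096

# The layer would only succeed on a still-flattened token tensor,
# e.g. (b, num_tokens, channels) with num_tokens = 64 * 64 = 4096:
print(can_unflatten((1, 4096, 96), dim=1, unflattened_size=(64, 64)))     # True
```

This matches the observation above: the Swin feature map already has separate spatial dimensions, so the `Unflatten` meant for a flattened ViT-style token sequence has nothing left to split.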
Has anyone tried training or inference with the Swin backbones?
Hi,

See #259 for easy inference with DPT + a Swin backbone.