
KeyError: 'aten::expand_as'

Open PanJinquan opened this issue 3 years ago • 8 comments

Describe the issue: It seems that the operation `aten::expand_as` is still not supported. If possible, please fix this issue as soon as possible. Thanks! @sverrejoh @bgianfo
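For context, `expand_as` broadcasts a tensor to the shape of another tensor without copying data, which is why mask propagation through this op needs special handling during speedup. NumPy's `broadcast_to` behaves analogously and is used here only as a stand-in illustration:

```python
import numpy as np

a = np.zeros((10, 5))  # plays the role of fc1's output (the target shape)
b = np.ones((10, 1))   # plays the role of fc2's output (to be broadcast)

# Analogous to b.expand_as(a) in PyTorch: stretch the size-1 axis to match a.
expanded = np.broadcast_to(b, a.shape)
print(expanded.shape)  # (10, 5)
```

Because the broadcast dimension shares storage, pruning channels on the target side must be reflected on the broadcast side as well, which is the hard part for a model-speedup pass.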

PanJinquan avatar Dec 10 '21 07:12 PanJinquan

@J-shang - is this a new support we shall add to backlog?

scarlett2018 avatar Mar 29 '22 02:03 scarlett2018

@scarlett2018 yes, I will support this op in v2.7 if possible.

J-shang avatar Mar 29 '22 02:03 J-shang

> @scarlett2018 yes, I will support this op in v2.7 if possible.

Have you fixed it?

xuzhuang1996 avatar May 09 '22 06:05 xuzhuang1996

@xuzhuang1996 This is a simple workaround: https://github.com/microsoft/nni/pull/4852. ~~But as the PR description said, there may be insufficient speedup near `expand_as`.~~ (We've fixed that issue and the speedup works fine now.) Any scenario that uses `expand_as` will help us improve this op-related speedup. If this workaround does not meet your needs, please contact us.

J-shang avatar May 13 '22 02:05 J-shang

> @xuzhuang1996 This is a simple workaround, #4852, but as the PR description said, there may be insufficient speedup near `expand_as`. Any scenario that uses `expand_as` will help us improve this op-related speedup. If this workaround does not meet your needs, please contact us.

Sadly not:

`aten::expand_as is not Supported!`

xuzhuang1996 avatar May 17 '22 12:05 xuzhuang1996

> Sadly not:
>
> `aten::expand_as is not Supported!`

Could you share more details about this error? This PR has not yet been merged into the master branch; did you install NNI from source at that commit? If so, could you provide a simple example so that we can reproduce your problem?
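For reference, installing NNI from source at a specific commit typically looks like the following sketch; the commit hash is a placeholder you would replace with the commit that contains PR #4852:

```shell
git clone https://github.com/microsoft/nni.git
cd nni
git checkout <commit-hash>   # placeholder: the commit containing the workaround
python -m pip install -e .   # editable install from this checkout
```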

FYI, we have tested it on the following simple example.

```python
import torch
from nni.compression.pytorch.pruning import L1NormPruner
from nni.compression.pytorch.speedup import ModelSpeedup


class TestModel(torch.nn.Module):
    def __init__(self) -> None:
        super().__init__()
        self.fc1 = torch.nn.Linear(10, 5)
        self.fc2 = torch.nn.Linear(10, 1)
        self.fc3 = torch.nn.Linear(5, 2)

    def forward(self, x):
        a = self.fc1(x)
        # expand_as broadcasts fc2's (N, 1) output to fc1's (N, 5) shape
        b = self.fc2(x).expand_as(a)
        return self.fc3(a + b)


model = TestModel()
# Prune 50% of fc1's output channels by L1 norm.
pruner = L1NormPruner(model, [{'op_names': ['fc1'], 'sparsity': 0.5}])
_, masks = pruner.compress()
pruner._unwrap_model()
print(masks)
# Propagate the masks through the graph, including the expand_as op.
ModelSpeedup(model, torch.rand(10, 10), masks).speedup_model()
print(model)
```

J-shang avatar May 18 '22 01:05 J-shang

Hi @xuzhuang1996, do you still have this problem?

Lijiaoa avatar Sep 08 '22 02:09 Lijiaoa

> hi @xuzhuang1996 Do you still have this problem?

Sadly not!

xuzhuang1996 avatar Sep 08 '22 03:09 xuzhuang1996