
Running inference on mobile

Open daniel-sudz opened this issue 11 months ago • 4 comments

I was wondering whether there are any technical challenges that would make running inference on mobile impossible. The models themselves seem small, but I'm not familiar enough with the codebase to know if there are other challenges or considerations.

Update 1

I am currently stuck on the following error when trying to convert the model to TorchScript. It seems easy enough to fix, but there will probably be more things that don't work afterwards as well.

Module 'Head' has no attribute 'res_blocks' (This attribute exists on the Python module, but we failed to convert Python type: 'list' to a TorchScript type. Could not infer type of list element: Cannot infer concrete type of torch.nn.Module. Its type was inferred; try adding a type annotation for the attribute.):
  File "/home/powerhorse/Desktop/daniel_tmp/benchmark/anchor/third_party/ace/ace_network.py", line 128
        res = self.head_skip(res) + x
    
        for res_block in self.res_blocks:
                         ~~~~~~~~~~~~~~~ <--- HERE
            x = F.relu(res_block[0](res))
            x = F.relu(res_block[1](x))
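The usual fix for this particular error is to register the submodules in an `nn.ModuleList` instead of a plain Python list, since TorchScript cannot infer the element type of a bare `list` of modules. Below is a minimal sketch illustrating the pattern with a toy stand-in for `Head` (the layer shapes and names here are hypothetical, not the real `ace_network.py` code):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Head(nn.Module):
    """Toy stand-in for the ACE Head (hypothetical, not the real ace_network.py)."""

    def __init__(self, channels: int = 8, num_blocks: int = 3):
        super().__init__()
        # nn.ModuleList instead of a plain Python list: TorchScript can only
        # script attributes whose element type it can infer, and a plain list
        # of nn.Module triggers "Could not infer type of list element".
        # ModuleList also properly registers the submodules and their parameters.
        self.res_blocks = nn.ModuleList(
            nn.Conv2d(channels, channels, kernel_size=1) for _ in range(num_blocks)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Iterating over an nn.ModuleList is supported by TorchScript.
        for res_block in self.res_blocks:
            x = F.relu(res_block(x))
        return x

scripted = torch.jit.script(Head())        # scripting succeeds with ModuleList
out = scripted(torch.randn(1, 8, 4, 4))
```

Note that the real network stores pairs of convolutions per block; those pairs could themselves be wrapped in nested `nn.ModuleList` (or `nn.Sequential`) entries, though nested containers can hit further TorchScript limitations, which may be the "more things that don't work afterwards".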

daniel-sudz · Jul 11 '23 16:07