
PyTorch to CoreML BatchNorm model SCRIPT conversion v4 fails with "temporary: the only valid use of a module is looking up an attribute but found = prim::SetAttr[name="num_batches_tracked"](%13, %80)"

Open leovinus2001 opened this issue 4 years ago • 4 comments

Relevance:-----------------------------------------------------------

Firstly, while JIT (trace) conversion of the attached test model and code works fine, conversion of the torch.jit.script() model fails on the BatchNorm layers. Both BatchNorm1d and BatchNorm2d fail, which is a blocker for using script mode and converting dynamic models.

Secondly, I have seen another prim::SetAttr error mentioned in issue #802 on FasterRCNN conversion, which may or may not be related.

Reproducible:-----------------------------------------------------------

Yes

Testcase:-----------------------------------------------------------

Attached. testScripting.bn.txt

Run e.g. as

python3 -O testScripting.bn.py

We see the error with useScriptingFlag = True; for comparison, useScriptingFlag = False uses the traced (JIT) model, which converts without error.

PS: You can change self.use1DBatchNorm = True/False to check whether 1D and 2D BatchNorm behave differently. Both give the same error.

Error message/ Log:-----------------------------------------------------------

Traceback (most recent call last):
  File "testScripting.bn.py", line 64, in <module>
    inputs= [ ct.TensorType(name="input1", shape=dummy_input.shape) ]
  File "~/Library/Python/3.7/lib/python/site-packages/coremltools/converters/_converters_entry.py", line 299, in convert
    **kwargs
  File "~/Library/Python/3.7/lib/python/site-packages/coremltools/converters/mil/converter.py", line 120, in _convert
    prog = frontend_converter(model, **kwargs)
  File "~/Library/Python/3.7/lib/python/site-packages/coremltools/converters/mil/converter.py", line 62, in __call__
    return load(*args, **kwargs)
  File "~/Library/Python/3.7/lib/python/site-packages/coremltools/converters/mil/frontend/torch/load.py", line 73, in load
    converter = TorchConverter(torchscript, inputs, outputs, cut_at_symbols)
  File "~/Library/Python/3.7/lib/python/site-packages/coremltools/converters/mil/frontend/torch/converter.py", line 140, in __init__
    raw_graph, params_dict = self._expand_and_optimize_ir(self.torchscript)
  File "~/Library/Python/3.7/lib/python/site-packages/coremltools/converters/mil/frontend/torch/converter.py", line 329, in _expand_and_optimize_ir
    torchscript.forward.graph, torchscript._c
RuntimeError: temporary: the only valid use of a module is looking up an attribute but found  = prim::SetAttr[name="num_batches_tracked"](%13, %80)
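Not part of the original report, but a commonly suggested workaround for this class of SetAttr errors is to construct the BatchNorm layers with track_running_stats=False, so the num_batches_tracked update never enters the scripted graph. A sketch (untested against coremltools 4; note this also disables the running mean/variance, which changes eval-time behavior, so it is a trade-off rather than a fix):

```python
import torch
import torch.nn as nn

class TestNetNoStats(nn.Module):
    # Hypothetical variant: with track_running_stats=False, BatchNorm
    # does not allocate a num_batches_tracked buffer, so scripting
    # produces no prim::SetAttr on it.
    def __init__(self):
        super().__init__()
        self.bn = nn.BatchNorm1d(8, track_running_stats=False)

    def forward(self, x):
        return self.bn(x)

model = TestNetNoStats().eval()
scripted = torch.jit.script(model)  # scripting still succeeds

# The attribute exists but holds None instead of a tensor, so there is
# nothing for the scripted graph to update.
assert model.bn.num_batches_tracked is None
```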

leovinus2001 avatar Jul 25 '20 19:07 leovinus2001

Same issue with coremltools 4.0b3 and PyTorch 1.6.0.

leovinus2001 avatar Aug 20 '20 15:08 leovinus2001

Hello, have you been able to resolve this issue? @leovinus2001

HashedViking avatar Mar 21 '21 15:03 HashedViking

This is still an issue with coremltools 5.0

TobyRoseman avatar Oct 14 '21 00:10 TobyRoseman

Hello, have you been able to resolve this issue? @leovinus2001

chendadon avatar Jul 31 '22 13:07 chendadon