Add generic annotator for data layout ops
Data layout ops like unsqueeze are not annotated by the quantizer by default, which leads to issues down the line. We therefore add a generic annotator to explicitly annotate those ops.
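For illustration, a minimal sketch of what such a generic annotator could look like, assuming the `torch.ao.quantization.quantizer` annotation API. The op set and the name `_annotate_data_layout_ops` are hypothetical and not the PR's actual code:

```python
# Minimal sketch, not the PR's implementation. Assumes the
# torch.ao.quantization.quantizer annotation API; the op set and the
# function name are illustrative only.
import torch
from torch.ao.quantization.quantizer import (
    QuantizationAnnotation,
    SharedQuantizationSpec,
)

# Ops that only rearrange data and should reuse their input's
# quantization parameters instead of getting fresh ones.
_DATA_LAYOUT_OPS = {
    torch.ops.aten.unsqueeze.default,
    torch.ops.aten.squeeze.dim,
    torch.ops.aten.view.default,
    torch.ops.aten.permute.default,
}


def _annotate_data_layout_ops(graph_module: torch.fx.GraphModule) -> None:
    for node in graph_module.graph.nodes:
        if node.op != "call_function" or node.target not in _DATA_LAYOUT_OPS:
            continue
        if node.meta.get("quantization_annotation") is not None:
            continue  # leave ops alone that an op-specific annotator handled
        input_node = node.args[0]
        # Tie both input and output to the producer's spec so that no
        # spurious (de)quantize pairs appear around pure data movement.
        # (For ops fed directly by graph placeholders, an explicit
        # QuantizationSpec on the input would be needed instead.)
        shared = SharedQuantizationSpec(input_node)
        node.meta["quantization_annotation"] = QuantizationAnnotation(
            input_qspec_map={input_node: shared},
            output_qspec=shared,
            _annotated=True,
        )
```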
:link: Helpful Links
:test_tube: See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/5814
:x: 3 New Failures, 1 Unrelated Failure
As of commit 265cdb7315095137d976a24e18852647973eb7b3 with merge base 393553ceecb26f73a189e7109c14503b34657c8e:
NEW FAILURES - The following jobs have failed:
- trunk / test-models-macos (cmake, ic3, xnnpack-quantization-delegation, macos-m1-stable, 90) / macos-job (gh): `Library not loaded: @rpath/liblz4.1.dylib`
- trunk / test-models-macos (cmake, mul, portable, macos-m1-stable, 90) / macos-job (gh): `Library not loaded: @rpath/liblz4.1.dylib`
- trunk / test-models-macos (cmake, vit, xnnpack-delegation, macos-m1-stable, 90) / macos-job (gh): `Library not loaded: @rpath/liblz4.1.dylib`
BROKEN TRUNK - The following job failed but was already present on the merge base:
👉 Rebase onto the `viable/strict` branch to avoid these failures
- trunk / test-coreml-delegate / macos-job (gh) (trunk failure): `RuntimeError: Command bash /Users/runner/work/_temp/exec_script failed with exit code 1`
> Data layout ops like unsqueeze are not annotated by the quantizer by default, which leads to issues down the line.

Can you elaborate? What issues? Is it just lowering issues?
@digantdesai has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.
Yes, it's a lowering issue. If the data layout op is at the beginning of the graph, forward propagation of annotations does not work (unless we annotate the inputs explicitly). We also tried implementing backward propagation, but sharing specs in the backward direction does not seem to be supported, so we went with explicit annotation. Or are we missing something?
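For context, a hypothetical minimal repro of the failure mode described above (the module name is made up): the very first op in the graph is a data layout op, so forward propagation has no annotated producer to start from.

```python
# Hypothetical repro sketch: unsqueeze is the first op in the graph,
# so there is no annotated producer ahead of it. Without an explicit
# generic annotator it would stay unannotated and break lowering.
import torch


class StartsWithUnsqueeze(torch.nn.Module):
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = x.unsqueeze(0)  # data layout op directly on a graph input
        return torch.nn.functional.relu(x)
```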
@digantdesai merged this pull request in pytorch/executorch@98a58e0dff16cf4c701785df1d41d16b2f5f1b44.