Arm backend: Add TOSA VGF encapsulated compilation target.
This change adds support for "vgf" files, which wrap TOSA output and include memory planning, targeting devices that can JIT-compile TOSA to the target ISA on-device.
- Add a VgfQuantizer (identical in behaviour to the TOSAQuantizer)
- Add a VgfBackend and VgfPartitioner to produce TOSA wrapped in a VGF
- Requires the yet-to-be-released converter_backend
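For context, the ahead-of-time flow for this target would presumably mirror the existing Arm TOSA flow, with the new classes slotted in. The following is a speculative pseudocode sketch, not runnable code: the converter_backend is not yet released, the `VgfQuantizer`/`VgfPartitioner` names are taken from the bullet list above, and the import paths and surrounding API calls are assumptions based on the existing ExecuTorch Arm backend, not a confirmed interface.

```python
# Pseudocode sketch only - the converter_backend is not yet released,
# and the import paths below are assumed, not confirmed by this PR.
import torch
from torch.export import export
from torch.ao.quantization.quantize_pt2e import prepare_pt2e, convert_pt2e
from executorch.exir import to_edge_transform_and_lower

# Hypothetical module paths; class names come from this PR's description.
from executorch.backends.arm.quantizer import VgfQuantizer
from executorch.backends.arm.vgf_partitioner import VgfPartitioner

model = torch.nn.Sequential(torch.nn.Conv2d(3, 8, 3), torch.nn.ReLU()).eval()
example_inputs = (torch.randn(1, 3, 224, 224),)

# Quantize with the VgfQuantizer (same behaviour as the TOSAQuantizer).
quantizer = VgfQuantizer()
prepared = prepare_pt2e(export(model, example_inputs).module(), quantizer)
prepared(*example_inputs)  # calibration pass
quantized = convert_pt2e(prepared)

# Partition and lower: the TOSA output is wrapped in a VGF file, which the
# target device can JIT-compile to its native ISA on-device.
edge = to_edge_transform_and_lower(
    export(quantized, example_inputs),
    partitioner=[VgfPartitioner()],
)
executorch_program = edge.to_executorch()
```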
Signed-off-by: Rob Elliott [email protected]
Change-Id: I764c32c33c503eb44200e9a7d98caa8fae8a4882
Test plan
As this is a new encapsulation and the tool is not yet released, unit-test integration will come in a subsequent commit.
cc @digantdesai @freddan80 @per @zingo @oscarandersson8218
:test_tube: See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/10476
:hourglass_flowing_sand: 2 Pending, 1 Unrelated Failure
As of commit cc9684baa8f1da227a8701c2c6844b0d4532e00c with merge base e5874fa4a1815f35230eaedc4eea58c1c53baca2:
FLAKY - The following job failed, likely due to flakiness present on trunk:
- pull / test-llava-runner-linux / linux-job (matched linux rule in flaky-rules.json): The process '/usr/bin/git' failed with exit code 128
This comment was automatically generated by Dr. CI and updates every 15 minutes.
Hi @robell, this seems to need a proper rebase after the latest merges to main in the build/test scripts. I tried an "Update branch" here in GitHub but got failures in the Arm runners, so it probably needs a proper check.
Done now - there was a minor issue with the restructuring of the quantizers/ import behaviour.
@zingo - can you review please? The internal review is done, so it should just need sign-off for merge.
The llava/llama failures are unrelated!
@robell - This is an exciting development. Let's update the readme once we have some substance to guide an oblivious visitor like me :p
Also, can you please tag @SS-JIA as well on future PRs? Just so we can "stay in the loop" for the development of this new backend. 🙏
Again pumped about this, thanks for driving this! :)