Use PTEFile class in serialize_pte_binary
Stack from ghstack (oldest at bottom):
- -> #15801
- #15800
Take a `PTEFile` object as the input to serialization instead of passing the program, mutable segments, and named data segments separately.
Differential Revision: D86908241
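For illustration, here is a minimal sketch of the shape of this change, assuming hypothetical field and parameter names; the actual `PTEFile` definition and `serialize_pte_binary` signature are in this PR's diff:

```python
# Sketch only: the real PTEFile fields and serialize_pte_binary signature
# live in this PR's diff; the names below are illustrative assumptions.
from dataclasses import dataclass
from typing import Any, Optional


@dataclass
class PTEFile:
    """Bundles the pieces of a .pte file that used to be passed separately."""

    program: Any                         # the flatbuffer Program to serialize
    mutable_data: Optional[Any] = None   # mutable tensor segments, if any
    named_data: Optional[Any] = None     # named data store output, if any


# Before (hypothetical): each piece was a separate argument.
#   serialized = serialize_pte_binary(program, mutable_data=..., named_data=...)
#
# After (hypothetical): callers build a PTEFile and hand it to the serializer.
#   pte = PTEFile(program=program, mutable_data=mutable, named_data=named)
#   serialized = serialize_pte_binary(pte)
```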
:link: Helpful Links
:test_tube: See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/15801
- :page_facing_up: Preview Python docs built from this PR
Note: Links to docs will display an error until the docs builds have been completed.
:x: 6 New Failures, 3 Unrelated Failures
As of commit 41d4aca8ace8705ac821c8443b23c2aabc503e02 with merge base b1e3e28bb611e06d484138be27221faffd89f565:
NEW FAILURES - The following jobs have failed:
- Test CUDA Builds / test-model-cuda-e2e (openai, whisper-large-v3-turbo, non-quantized) / linux-job (gh)
  RuntimeError: Command docker exec -t ddec3b7b36182660b93524e8a1b2dbbe9e229abdb0afa12a950e2d93f24e6fbb /exec failed with exit code 1
- Test CUDA Builds / test-model-cuda-e2e (openai, whisper-large-v3-turbo, quantized-int4-tile-packed) / linux-job (gh)
  RuntimeError: Command docker exec -t f689e44f85f58355d9a03abdf0255a6cfeae13e2cc2b48518bd8de582a9e019b /exec failed with exit code 1
- Test CUDA Builds / test-model-cuda-e2e (openai, whisper-large-v3-turbo, quantized-int4-weight-only) / linux-job (gh)
  RuntimeError: Command docker exec -t 64948aeafc08e61f2636ea863a3e4cdd394f8778201dcab8383d3b00c3871b5c /exec failed with exit code 1
- Test CUDA Builds / test-model-cuda-e2e (openai, whisper-small, non-quantized) / linux-job (gh)
  RuntimeError: Command docker exec -t f7423ee5f2e1d34782357a12f847c8f616c5b9b97fafe5345e25ef04f0f39a94 /exec failed with exit code 1
- Test CUDA Builds / test-model-cuda-e2e (openai, whisper-small, quantized-int4-tile-packed) / linux-job (gh)
  RuntimeError: Command docker exec -t f111cc914189cf941cb679b21c87b775c8f98ad0bfc6e063dbd0033f6927c69e /exec failed with exit code 1
- Test CUDA Builds / test-model-cuda-e2e (openai, whisper-small, quantized-int4-weight-only) / linux-job (gh)
  RuntimeError: Command docker exec -t 51b8a675db6fe40817374c90b1a812902068c672c8bd871a431dce80552d3306 /exec failed with exit code 1
FLAKY - The following jobs failed but were likely due to flakiness present on trunk:
- pull / test-llama-runner-linux (fp32, xnnpack+custom+quantize_kv, linux.arm64.2xlarge, executorch-ubuntu... / linux-job (gh) (matched linux rule in flaky-rules.json)
  The runner has received a shutdown signal. This can happen when the runner service is stopped, or a manually started runner is canceled.
- Test Metal Backend / test-model-metal-e2e (openai, whisper-small, non-quantized) / macos-job (gh) (similar failure)
  RuntimeError: Command bash /Users/ec2-user/runner/_work/_temp/exec_script failed with exit code 1
BROKEN TRUNK - The following job failed but was already present on the merge base:
👉 Rebase onto the `viable/strict` branch to avoid these failures
- Test Metal Backend / test-model-metal-e2e (openai, whisper-large-v3-turbo, non-quantized) / macos-job (gh) (trunk failure)
  RuntimeError: Command bash /Users/ec2-user/runner/_work/_temp/exec_script failed with exit code 1
This comment was automatically generated by Dr. CI and updates every 15 minutes.
This PR needs a `release notes:` label
If your change should be included in the release notes (i.e. would users of this library care about this change?), please use a label starting with `release notes:`. This helps us keep track and include your important work in the next release notes.
To add a label, you can comment to pytorchbot, for example
@pytorchbot label "release notes: none"
For more information, see https://github.com/pytorch/pytorch/wiki/PyTorch-AutoLabel-Bot#why-categorize-for-release-notes-and-how-does-it-work.