[ET-VK] Make libtorch optional in custom op test binaries #19433
pytorchbot wants to merge 1 commit into main
Conversation
Add an `include_torch` parameter (default `False`) to `define_custom_op_test_binary()`. None of the custom op test binaries directly include torch/ATen/c10 headers, so libtorch was unnecessary baggage. Dropping it reduces the `q4gsw_linear_adreno` binary from ~1 GB to 74 MB.

Differential Revision: [D104456804](https://our.internmc.facebook.com/intern/diff/D104456804/)

ghstack-source-id: 379498992

Pull Request resolved: #19402
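The change described above can be sketched as a build-macro gate on the torch dependency. This is a minimal illustrative sketch, not the actual ExecuTorch build definitions: the `TORCH_DEPS` list, the target labels, and the returned dict are all hypothetical; only the `define_custom_op_test_binary` name and the `include_torch` parameter come from the PR description.

```python
# Hypothetical sketch of the macro change (Starlark-style, runnable as Python).
# TORCH_DEPS and the target labels are illustrative, not real Buck targets.
TORCH_DEPS = [
    "//third-party:libtorch",  # pulls in torch/ATen/c10 transitively
]

def define_custom_op_test_binary(name, srcs, deps=None, include_torch=False):
    """Declare a custom op test binary; link libtorch only when requested."""
    all_deps = list(deps or [])
    if include_torch:
        all_deps += TORCH_DEPS  # opt-in path for tests that need ATen
    return {"name": name, "srcs": srcs, "deps": all_deps}

# Default (include_torch=False): libtorch never enters the dep graph,
# which is what shrinks the binary in the PR.
lean_rule = define_custom_op_test_binary("q4gsw_linear_adreno", ["main.cpp"])
```

With this shape, any test that genuinely needs ATen can pass `include_torch=True` explicitly, while every other binary drops the dependency by default.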
🔗 Helpful Links: 🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/19433

Note: Links to docs will display an error until the docs builds have been completed. ❗ There is 1 currently active SEV; if your PR is affected, please review it. ⏳ No failures, 1 pending as of commit 24de0f7 with merge base c564936. This comment was automatically generated by Dr. CI and updates every 15 minutes.
This PR was created by the merge bot to help merge the original PR into the main branch.
ghstack PR number: #19402 by @SS-JIA
^ Please use this as the source of truth for the PR details, comments, and reviews
ghstack PR base: https://github.com/pytorch/executorch/tree/gh/SS-JIA/529/base
ghstack PR head: https://github.com/pytorch/executorch/tree/gh/SS-JIA/529/head
Merge bot PR base: https://github.com/pytorch/executorch/tree/main
Merge bot PR head: https://github.com/pytorch/executorch/tree/gh/SS-JIA/529/orig
Differential Revision: D104456804
@diff-train-skip-merge