[ET-VK] Add etvk.use_existing_vma config to avoid duplicate VMA symbols #18797
Pull Request resolved: #18522

Apps that already link VulkanMemoryAllocatorInstantiated (e.g. Stella, via IGL or Diamond/Skia) get duplicate symbol errors when also linking ExecuTorch's Vulkan backend, because vma_api.cpp defines VMA_IMPLEMENTATION independently.

Add a Buck config flag `etvk.use_existing_vma=1` that:
- Defines ETVK_USE_META_VMA, which makes vma_api.h match the third-party VulkanMemoryAllocatorInstantiated config (Vulkan 1.2, dynamic function loading) so struct layouts agree
- Skips VMA_IMPLEMENTATION in vma_api.cpp so no duplicate definitions are emitted
- Swaps the Buck dep from VulkanMemoryAllocator_xplat (header-only) to VulkanMemoryAllocatorInstantiated (pre-compiled)

Off by default; no behavior change for existing builds or OSS.

ghstack-source-id: 364855830
@exported-using-ghexport

Differential Revision: [D98250268](https://our.internmc.facebook.com/intern/diff/D98250268/)
🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/18797
Note: Links to docs will display an error until the docs builds have been completed.

❗ 1 Active SEV: There is 1 currently active SEV. If your PR is affected, please view it below:

❌ 1 New Failure, 2 Pending, 2 Unrelated Failures as of commit c01686c with merge base 0ee0f67.

NEW FAILURE - The following job has failed:
FLAKY - The following job failed but was likely due to flakiness present on trunk:
BROKEN TRUNK - The following job failed but was present on the merge base:

👉 Rebase onto the `viable/strict` branch to avoid these failures
This comment was automatically generated by Dr. CI and updates every 15 minutes.
This PR was created by the merge bot to help merge the original PR into the main branch.
ghstack PR number: #18522 by @SS-JIA
^ Please use this as the source of truth for the PR details, comments, and reviews
ghstack PR base: https://github.com/pytorch/executorch/tree/gh/SS-JIA/512/base
ghstack PR head: https://github.com/pytorch/executorch/tree/gh/SS-JIA/512/head
Merge bot PR base: https://github.com/pytorch/executorch/tree/main
Merge bot PR head: https://github.com/pytorch/executorch/tree/gh/SS-JIA/512/orig
Differential Revision: D98250268
@diff-train-skip-merge