
CMake exports for static onnxruntime

Open · jordanozang opened this pull request 1 year ago • 9 comments

Description

See the example repository for a minimal example of using the static CMake Config.

  • Add the libraries built during the static onnxruntime build (onnxruntime_common, onnxruntime_mlas, etc.) to the onnxruntime export set, and add an onnxruntime::onnxruntime interface target that behaves much the same as the target of the same name in the shared build. With these changes, find_package(onnxruntime REQUIRED) followed by target_link_libraries(example PRIVATE onnxruntime::onnxruntime) should now work; see the consumer sketch after this list.
  • Add an onnxruntime_USE_EIGEN_PACKAGE option to enable obtaining Eigen via find_package. It cannot reliably be added to FIND_PACKAGE_ARGS because onnxruntime depends on a version of Eigen that sits between releases, so the option is intended for controlled environments such as vcpkg. In the example repo, I overrode the vcpkg port to fetch the same Eigen revision that onnxruntime uses.
  • Pin the Protobuf version in FIND_PACKAGE_ARGS to 3.21, because incompatible versions cause a build failure.
  • Add a case to handle systems such as Arch Linux where the imported target is named nsync_cpp rather than unofficial::nsync::nsync_cpp (as it is with vcpkg); a sketch of that distinction follows this list. Also bump the nsync FetchContent version from 1.26.0 to 1.29.0 to take advantage of the new export targets upstream.
  • Minor modifications to ensure that dependency targets like Boost::mp11 are treated as imported targets and not part of the build interface.
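
As a rough sketch of the intended consumer workflow (the project name, source file, and prefix-path hint are placeholders; only find_package(onnxruntime REQUIRED) and the onnxruntime::onnxruntime target come from the changes described above):

  cmake_minimum_required(VERSION 3.20)
  project(example CXX)

  # Locate the static onnxruntime config package produced by the export set in this PR.
  # Configuring may require -DCMAKE_PREFIX_PATH=<onnxruntime install prefix>.
  find_package(onnxruntime REQUIRED)

  add_executable(example main.cpp)

  # The interface target is expected to pull in onnxruntime_common, onnxruntime_mlas,
  # and the other static libraries transitively.
  target_link_libraries(example PRIVATE onnxruntime::onnxruntime)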

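For the nsync bullet, the sketch below shows one way the two cases could be distinguished; the ORT_NSYNC_TARGET variable and the surrounding logic are illustrative assumptions, and only the two target names and onnxruntime_common appear in the description above:

  # Prefer the vcpkg-namespaced target when it exists; otherwise fall back to the
  # plain imported nsync_cpp target provided by distributions such as Arch Linux.
  if(TARGET unofficial::nsync::nsync_cpp)
    set(ORT_NSYNC_TARGET unofficial::nsync::nsync_cpp)
  elseif(TARGET nsync_cpp)
    set(ORT_NSYNC_TARGET nsync_cpp)
  endif()

  target_link_libraries(onnxruntime_common PRIVATE ${ORT_NSYNC_TARGET})
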
Motivation and Context

  • Resolves Issue #21351
  • Resolves Issue #22092
  • Builds on Pull Request #21348

Edit: the Abseil version was already pinned in FIND_PACKAGE_ARGS by another PR, so this change no longer touches it.

jordanozang avatar Sep 23 '24 00:09 jordanozang

@microsoft-github-policy-service agree


jordanozang avatar Sep 23 '24 03:09 jordanozang

I think @snnn probably has the most context on this. Hopefully tagging is ok. I don't think I have permission to add reviewers as suggested in the contributing guidelines.

jordanozang avatar Sep 27 '24 19:09 jordanozang

@skottmckay Hey, could you please merge this?

zwillikon avatar Oct 14 '24 17:10 zwillikon

/azp run Big Models, Linux Android Emulator QNN CI Pipeline, Linux CPU CI Pipeline, Linux CPU Minimal Build E2E CI Pipeline, Linux GPU CI Pipeline, Linux GPU TensorRT CI Pipeline

snnn avatar Oct 15 '24 17:10 snnn

/azp run Linux OpenVINO CI Pipeline, Linux QNN CI Pipeline, MacOS CI Pipeline, ONNX Runtime Web CI Pipeline, Windows ARM64 QNN CI Pipeline,

snnn avatar Oct 15 '24 17:10 snnn

/azp run Windows CPU CI Pipeline, Windows GPU CUDA CI Pipeline, Windows GPU DML CI Pipeline, Windows GPU Doc Gen CI Pipeline, Windows GPU TensorRT CI Pipeline, Windows x64 QNN CI Pipeline, onnxruntime-binary-size-checks-ci-pipeline, orttraining-linux-ci-pipeline, orttraining-linux-gpu-ci-pipeline,

snnn avatar Oct 15 '24 17:10 snnn

Azure Pipelines successfully started running 6 pipeline(s).

azure-pipelines[bot] avatar Oct 15 '24 17:10 azure-pipelines[bot]

Azure Pipelines successfully started running 5 pipeline(s).

azure-pipelines[bot] avatar Oct 15 '24 17:10 azure-pipelines[bot]

Azure Pipelines successfully started running 9 pipeline(s).

azure-pipelines[bot] avatar Oct 15 '24 17:10 azure-pipelines[bot]

/azp run Big Models, Linux Android Emulator QNN CI Pipeline, Linux CPU CI Pipeline, Linux CPU Minimal Build E2E CI Pipeline, Linux GPU CI Pipeline, Linux GPU TensorRT CI Pipeline

snnn avatar Oct 22 '24 14:10 snnn

/azp run Linux OpenVINO CI Pipeline, Linux QNN CI Pipeline, MacOS CI Pipeline, ONNX Runtime Web CI Pipeline, Windows ARM64 QNN CI Pipeline

snnn avatar Oct 22 '24 14:10 snnn

/azp run Windows CPU CI Pipeline, Windows GPU CUDA CI Pipeline, Windows GPU DML CI Pipeline, Windows GPU Doc Gen CI Pipeline, Windows GPU TensorRT CI Pipeline, Windows x64 QNN CI Pipeline, onnxruntime-binary-size-checks-ci-pipeline, orttraining-linux-ci-pipeline, orttraining-linux-gpu-ci-pipeline

snnn avatar Oct 22 '24 14:10 snnn

Azure Pipelines successfully started running 6 pipeline(s).

azure-pipelines[bot] avatar Oct 22 '24 14:10 azure-pipelines[bot]

Azure Pipelines successfully started running 5 pipeline(s).

azure-pipelines[bot] avatar Oct 22 '24 14:10 azure-pipelines[bot]

Azure Pipelines successfully started running 9 pipeline(s).

azure-pipelines[bot] avatar Oct 22 '24 14:10 azure-pipelines[bot]

/azp run Big Models, Linux Android Emulator QNN CI Pipeline, Linux CPU CI Pipeline, Linux CPU Minimal Build E2E CI Pipeline, Linux GPU CI Pipeline, Linux GPU TensorRT CI Pipeline

snnn avatar Oct 22 '24 21:10 snnn

/azp run Linux OpenVINO CI Pipeline, Linux QNN CI Pipeline, MacOS CI Pipeline, ONNX Runtime Web CI Pipeline, Windows ARM64 QNN CI Pipeline

snnn avatar Oct 22 '24 21:10 snnn

/azp run Windows CPU CI Pipeline, Windows GPU CUDA CI Pipeline, Windows GPU DML CI Pipeline, Windows GPU Doc Gen CI Pipeline, Windows GPU TensorRT CI Pipeline, Windows x64 QNN CI Pipeline, onnxruntime-binary-size-checks-ci-pipeline, orttraining-linux-ci-pipeline, orttraining-linux-gpu-ci-pipeline

snnn avatar Oct 22 '24 21:10 snnn

Azure Pipelines successfully started running 6 pipeline(s).

azure-pipelines[bot] avatar Oct 22 '24 21:10 azure-pipelines[bot]

Azure Pipelines successfully started running 5 pipeline(s).

azure-pipelines[bot] avatar Oct 22 '24 21:10 azure-pipelines[bot]

Azure Pipelines successfully started running 9 pipeline(s).

azure-pipelines[bot] avatar Oct 22 '24 21:10 azure-pipelines[bot]

/azp run Big Models, Linux Android Emulator QNN CI Pipeline, Linux CPU CI Pipeline, Linux CPU Minimal Build E2E CI Pipeline, Linux GPU CI Pipeline, Linux GPU TensorRT CI Pipeline

snnn avatar Oct 22 '24 23:10 snnn

/azp run Linux OpenVINO CI Pipeline, Linux QNN CI Pipeline, MacOS CI Pipeline, ONNX Runtime Web CI Pipeline, Windows ARM64 QNN CI Pipeline

snnn avatar Oct 22 '24 23:10 snnn

/azp run Windows CPU CI Pipeline, Windows GPU CUDA CI Pipeline, Windows GPU DML CI Pipeline, Windows GPU Doc Gen CI Pipeline, Windows GPU TensorRT CI Pipeline, Windows x64 QNN CI Pipeline, onnxruntime-binary-size-checks-ci-pipeline, orttraining-linux-ci-pipeline, orttraining-linux-gpu-ci-pipeline

snnn avatar Oct 22 '24 23:10 snnn

Azure Pipelines successfully started running 6 pipeline(s).

azure-pipelines[bot] avatar Oct 22 '24 23:10 azure-pipelines[bot]

Azure Pipelines successfully started running 5 pipeline(s).

azure-pipelines[bot] avatar Oct 22 '24 23:10 azure-pipelines[bot]

Azure Pipelines successfully started running 9 pipeline(s).

azure-pipelines[bot] avatar Oct 22 '24 23:10 azure-pipelines[bot]

/azp run Big Models, Linux Android Emulator QNN CI Pipeline, Linux CPU CI Pipeline, Linux CPU Minimal Build E2E CI Pipeline, Linux GPU CI Pipeline, Linux GPU TensorRT CI Pipeline

snnn avatar Oct 25 '24 18:10 snnn

/azp run Linux OpenVINO CI Pipeline, Linux QNN CI Pipeline, MacOS CI Pipeline, ONNX Runtime Web CI Pipeline, Windows ARM64 QNN CI Pipeline, Windows CPU CI Pipeline, Windows GPU CUDA CI Pipeline

snnn avatar Oct 25 '24 18:10 snnn

/azp run Windows GPU DML CI Pipeline, Windows GPU Doc Gen CI Pipeline, Windows GPU TensorRT CI Pipeline, Windows x64 QNN CI Pipeline

snnn avatar Oct 25 '24 18:10 snnn