visionOS 2 Object Tracking Demo
visionOS 2 + Object Tracking + ARKit means: we can create visual highlights of real world objects around us and have those visualizations respond to the proximity of our hands.
This project is largely a quick repurposing and combination of Apple's Scene Reconstruction sample project (which uses ARKit's HandTrackingProvider) and its Object Tracking sample project.
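The combination described above boils down to running an object-tracking provider and a hand-tracking provider in the same ARKit session, then comparing anchor positions. A minimal sketch of that idea follows; the file name `Milk` and the class name `TrackingModel` are assumptions for illustration, not names from the actual project.

```swift
import ARKit
import simd

@MainActor
final class TrackingModel {
    let session = ARKitSession()
    let handTracking = HandTrackingProvider()

    func run() async throws {
        // Load a bundled .referenceObject (the resource name is a placeholder).
        guard let url = Bundle.main.url(forResource: "Milk",
                                        withExtension: "referenceObject") else { return }
        let referenceObject = try await ReferenceObject(from: url)
        let objectTracking = ObjectTrackingProvider(referenceObjects: [referenceObject])

        // Run both providers in one session so object and hand anchors
        // arrive in the same coordinate space.
        try await session.run([objectTracking, handTracking])

        for await update in objectTracking.anchorUpdates {
            // Translation column of the anchor's transform gives the
            // object's position; compare it against stored hand-joint
            // positions (from handTracking.anchorUpdates) with
            // simd_distance(...) to drive proximity-based visuals.
            let objectPosition = update.anchor.originFromAnchorTransform.columns.3
            _ = objectPosition
        }
    }
}
```

This is a sketch of the session setup only; the real project also handles provider authorization and renders highlight entities with RealityKit.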
The full demo video with sound is here.
Some details about putting together this demo are over here.
Build Instructions
- Select your Apple Developer account under Signing & Capabilities
- Build
Models Used in This Project
I live in Chicago and purchased the cereal and milk at a local Jewel in June 2024 – your local packaging may vary and prevent recognition. The three products used are:
Using Your Own Models
If you want to strip out the three bundled objects and use your own:
- You will need to train on a `.usdz` file to create a `.referenceObject`; I recommend using Apple's Object Capture sample project to create a `.usdz` file of your object
- You will need to use Create ML (version 6 or higher, which comes bundled with Xcode 16) to train a `.referenceObject` from your `.usdz`; for me this process has taken anywhere from 4 to 16 hours per `.referenceObject`
- You will need to bundle your new `.referenceObject` in the Xcode project
- You will need to coordinate the naming of your new `.referenceObject` with the demo's `ObjectType` enum so everything plays nicely together
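To illustrate the last step, here is a hedged sketch of what an `ObjectType` enum coordinated with bundled file names could look like. The case names and raw values below are assumptions; the actual enum in the demo may differ. The key point is that each raw value must match a bundled `.referenceObject` file name exactly.

```swift
import Foundation

/// Hypothetical sketch: raw values must match the .referenceObject
/// file names you bundle in the Xcode project.
enum ObjectType: String, CaseIterable {
    case cereal = "Cereal"   // placeholder name
    case milk = "Milk"       // placeholder name

    /// Resolve the bundled .referenceObject URL for this case;
    /// returns nil if the file name and raw value are out of sync.
    var referenceObjectURL: URL? {
        Bundle.main.url(forResource: rawValue, withExtension: "referenceObject")
    }
}
```

With this shape, a mismatch between an enum case and a bundled file shows up as a `nil` URL at load time rather than a silent tracking failure.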