react-native-vision-camera
How to get depth data from TrueDepth & LiDAR❓
Question
Hey,
I don't know if this is doable, but hopefully someone can help.
I'm running the example app on my iPhone 13 Pro (which has both a LiDAR scanner and a TrueDepth front camera) and trying to get depth data out of it. My goal is to reconstruct an object in 3D from the depth data provided by the cameras, and I want to try both TrueDepth and LiDAR to compare their accuracy.
First question: how can I get depth data from a captured photo? Second question: how can I get it from a Frame inside a Frame Processor?
What I tried
The example app is running.
- I took a picture with the relevant camera (LiDAR or TrueDepth), saved it to my Mac, and opened it, but couldn't find any depth data (see the sketch below this list).
- I tried to log data from the Frame in the Frame Processor, without success.
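
For anyone hitting the same wall: on iOS, the depth map of a photo is not stored as visible pixels but as auxiliary data inside the HEIC/JPEG container, so opening the file in a normal image viewer won't reveal it. Below is a minimal Swift sketch (plain Apple APIs, not VisionCamera-specific) for checking whether a saved photo actually contains a depth or disparity map, assuming depth delivery was enabled for the capture and the file is the unmodified output of takePhoto():

```swift
import ImageIO
import AVFoundation

/// Returns the embedded depth (or disparity) map of a photo file,
/// if the capture pipeline attached one as auxiliary data.
func readDepthData(fromPhotoAt url: URL) -> AVDepthData? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else {
        return nil
    }
    // Portrait/TrueDepth captures usually embed disparity;
    // fall back to depth if disparity is not present.
    for auxType in [kCGImageAuxiliaryDataTypeDisparity, kCGImageAuxiliaryDataTypeDepth] {
        if let info = CGImageSourceCopyAuxiliaryDataInfoAtIndex(source, 0, auxType) as? [AnyHashable: Any] {
            return try? AVDepthData(fromDictionaryRepresentation: info)
        }
    }
    return nil
}
```

If this returns nil, the depth map was most likely never written into the file (for example, depth delivery wasn't enabled when the photo was taken), or the file was re-encoded along the way and the auxiliary data was stripped.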
VisionCamera Version
3.6.4
Additional information
- [ ] I am using Expo
- [X] I have read the Troubleshooting Guide
- [X] I agree to follow this project's Code of Conduct
- [X] I searched for similar questions in the issues page as well as in the discussions page and found none.
Bumping this ;) does someone have a working example of this that I can use? Thanks!
I have the same question. Also on an iPhone 13 Pro, although I am using Expo.
Aha! I see the Camera component has enableDepthData.
Hey! You should be able to access this data on the native side in a Frame Processor Plugin. I think depth data should be attached to the Frame; if not, that'd be a small change on my end. If you need help with this, consider contacting us through my agency: https://margelo.io
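
For anyone who wants to experiment with this, here is a rough Swift sketch of that native side. The plugin name (DepthProbePlugin) is made up, the registration step depends on your VisionCamera 3.x setup and is only referenced in a comment, and whether an AVDepthData object is actually attached to the Frame in 3.6.4 is exactly the open question in this thread, so the depth part is left as a note rather than real code:

```swift
import VisionCamera
import AVFoundation
import CoreMedia

// Hypothetical plugin; register it for your VisionCamera version
// (e.g. via the Swift frame processor export macro in an .m file)
// as described in the Frame Processor Plugin documentation.
@objc(DepthProbePlugin)
public class DepthProbePlugin: FrameProcessorPlugin {
  public override func callback(_ frame: Frame,
                                 withArguments arguments: [AnyHashable: Any]?) -> Any? {
    // frame.buffer is the CMSampleBuffer of the current video frame.
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(frame.buffer) else {
      return nil
    }

    // Basic frame info, just to prove the native side is reachable.
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    let format = CVPixelBufferGetPixelFormatType(pixelBuffer)

    // NOTE: extracting AVDepthData here is the unresolved part.
    // If depth is not attached to the Frame in your version, exposing it
    // would need the library-side change mentioned above.

    return [
      "width": width,
      "height": height,
      "pixelFormat": Int(format),
    ]
  }
}
```

Remember that depth delivery also has to be enabled on the JS side (the enableDepthData prop mentioned above), and the plugin is then called from a useFrameProcessor worklet as usual.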
@ionflow were you eventually able to access depth data from the frame in a frame processor? Trying to access the depth data now myself as well!
Hi, was this issue resolved? Why was it closed? I can't see a solution here.