Valentyn Stadnytskyi
> The Azure Kinect team just released a high fidelity model - https://github.com/microsoft/Azure-Kinect-Sensor-SDK/tree/develop/assets.
>
> This needs to be converted to dae, and added as a mesh resource to the...
> I submitted a PR (#60) a couple months ago that does this.
>
> Just set the environment variable `EGL_DEVICE_ID`, e.g.
>
> ```
> export EGL_DEVICE_ID=1
> python...
> ```
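The same device selection can also be done in-process, before any EGL-backed renderer is initialized — a minimal sketch, assuming a Python script that later imports an EGL-based rendering library (the device index `1` is illustrative; valid values depend on the GPUs available):

```python
import os

# Select the EGL device before any EGL-backed renderer is imported/initialized.
# The index "1" is an example; it must match an existing device on the machine.
os.environ["EGL_DEVICE_ID"] = "1"

# Any subsequent EGL-based headless rendering in this process will see this value.
print(os.environ["EGL_DEVICE_ID"])
```

Setting the variable inside the script only works if it runs before the rendering library reads it, which is why exporting it in the shell (as in the quoted comment) is the safer default.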
It is also very nicely explained in the paper - https://arxiv.org/pdf/2211.02648.pdf. If I understand it correctly, we get X, Y, Z coordinates. Why do we need two more vectors -...
This would be quite handy; however, I do not think research mode provides this capability, judging by the original research paper https://arxiv.org/abs/2008.11239
I have not looked into the library you are referring to. However, I can say that the current library (HL2ss) is super easy to use, and I find it quite intuitive....
The answer to this question is in the original HoloLens 2 research paper: https://arxiv.org/pdf/2008.11239.pdf