
Gaze location in PV (u, v) coordinates

Open · TimSchoonbeek opened this issue 2 years ago · 3 comments

I am just starting to familiarize myself with the repo; nice work!

Is there a demo where the gaze is read from the HL2 and projected back onto the PV sensor? For example, client_si.py projects the hands onto the PV feed; is there something to reproject the gaze onto this PV feed as well?

Thanks in advance.

TimSchoonbeek · Feb 16 '23 15:02

Hello,

We don't have a demo that projects gaze to PV. Gaze is given as an origin (3D point) plus a direction (3D ray), so it projects to a line in PV frames. To project the gaze location to PV: (1) obtain the length of the gaze ray via raycasting (as in open3d_viewer_si.py), (2) compute the 3D gaze point as origin + length * direction, and (3) project that 3D point to PV using the world-to-camera and intrinsic matrices (as in client_pv_si.py).
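Steps (2) and (3) above can be sketched as follows. This is an illustrative helper, not part of hl2ss: the function name and the row-vector convention (points right-multiplied by the matrices, with the intrinsic matrix transposed accordingly) are assumptions here, so check them against the conventions used in client_pv_si.py.

```python
import numpy as np

def project_gaze_to_pv(origin, direction, length, world_to_camera, intrinsics):
    # (2) 3D gaze point in world coordinates: origin + length * direction
    point_world = np.asarray(origin) + length * np.asarray(direction)
    # (3a) transform to camera coordinates; row-vector convention with a
    # 4x4 world-to-camera matrix is assumed here
    point_camera = np.append(point_world, 1.0) @ world_to_camera
    # (3b) apply the 3x3 intrinsic matrix (transposed for row vectors,
    # i.e. last row holds the principal point) and dehomogenize
    uvw = point_camera[:3] @ intrinsics
    return uvw[:2] / uvw[2]

# Toy example with an identity pose and made-up intrinsics
# (fx = fy = 100, principal point at (320, 240)):
K = np.array([[100.0, 0.0, 0.0],
              [0.0, 100.0, 0.0],
              [320.0, 240.0, 1.0]])
uv = project_gaze_to_pv([0.0, 0.0, 0.0], [0.0, 0.0, 1.0], 2.0, np.eye(4), K)
# A ray straight down the optical axis lands on the principal point.
```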

jdibenes · Feb 18 '23 03:02

Thank you for confirming that I did not miss it. I will look into implementing it, and I can contribute it here once it is working.

TimSchoonbeek · Feb 20 '23 15:02

If I understand the raycast correctly, I need depth data for this, as in open3d_viewer_si.py: cast a ray from the origin along the direction stored in eye_ray and find where it hits the reconstructed geometry. I am able to do this with the given code. However, the depth data is only available at 5 FPS, while the gaze (and hands) are available at 60 FPS. Additionally, building the whole volume is computationally expensive, bringing the total framerate down to only 2 FPS on my system. Am I overlooking something, or will I not be able to get gaze data at at least the same framerate as the PV sensor?
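One possible workaround for the rate mismatch, sketched below with illustrative names (this is not part of hl2ss): update the raycast length only when a new depth frame arrives, cache it, and reuse the cached length when projecting every 60 FPS gaze sample. This assumes the gaze depth changes slowly between depth frames, which may not hold when the user shifts focus between near and far objects.

```python
import numpy as np

class GazeProjector:
    """Decouple the slow depth/raycast update from the fast gaze stream
    by caching the last known ray length (illustrative sketch)."""

    def __init__(self, default_length=1.0):
        self.length = default_length  # fallback until the first raycast hit

    def update_length(self, length):
        # Call at the depth-stream rate whenever the raycast hits geometry.
        self.length = length

    def project(self, origin, direction, world_to_camera, intrinsics):
        # Call at the gaze-stream rate; reuses the cached ray length.
        # Row-vector convention is assumed, as elsewhere in this thread.
        point_world = np.asarray(origin) + self.length * np.asarray(direction)
        point_camera = np.append(point_world, 1.0) @ world_to_camera
        uvw = point_camera[:3] @ intrinsics
        return uvw[:2] / uvw[2]
```

Per gaze sample this costs only a couple of small matrix products, so the 60 FPS loop is no longer gated on volume integration; the expensive reconstruction can run in a separate, slower loop that calls update_length.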

Also, Hololens2forCV already stores the length of the gaze ray in its recordings. Would that also be a possibility for this repo?

TimSchoonbeek · Feb 20 '23 17:02