pytorch_mpiigaze_demo

Gaze estimation using MPIIGaze and MPIIFaceGaze

Results: 20 pytorch_mpiigaze_demo issues, sorted by most recently updated

Greetings and thanks for this amazing repository. I'd suggest the following changes to the requirements, as I got these errors when creating the venv: ``` python3.7 -m venv .venv source .venv/bin/activate pip...

Hi, thanks again for the great work. I just want to get the on-screen gaze point, so I calculate the point of intersection between the screen plane and the gaze segment (as the source code written...
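In case it helps others attempting the same, below is a minimal sketch of intersecting a gaze ray with a screen plane. The eye center, gaze direction, and screen plane parameters are all made-up assumptions, not values produced by this repository.

```python
import numpy as np

def intersect_gaze_with_plane(eye_center, gaze_vector, plane_point, plane_normal):
    """Return the point where the gaze ray hits the plane, or None if it misses.

    eye_center:   3D origin of the gaze ray (camera coordinates), shape (3,)
    gaze_vector:  3D gaze direction, shape (3,)
    plane_point:  any point lying on the screen plane, shape (3,)
    plane_normal: normal vector of the screen plane, shape (3,)
    """
    denom = np.dot(plane_normal, gaze_vector)
    if abs(denom) < 1e-8:  # ray is (nearly) parallel to the plane
        return None
    t = np.dot(plane_normal, plane_point - eye_center) / denom
    if t < 0:  # intersection lies behind the eye
        return None
    return eye_center + t * gaze_vector

# Hypothetical setup: a screen 0.4 m in front of the camera, facing it
# (normal along -z in camera coordinates).
eye = np.array([0.0, 0.0, 0.0])
gaze = np.array([0.1, -0.05, 1.0])           # example gaze direction
point_on_screen = np.array([0.0, 0.0, 0.4])  # assumed screen position
screen_normal = np.array([0.0, 0.0, -1.0])
print(intersect_gaze_with_plane(eye, gaze, point_on_screen, screen_normal))
```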

Hello, I am very interested in gaze tracking, but I am a newbie. How can I input a video and then get the 3D coordinates of the eye centers for...
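Not an answer from the author, but a rough outline of one way to iterate over a video file with OpenCV and collect per-frame results; `estimate_eye_centers` is a hypothetical placeholder for whatever model call you end up using, not a function from this repository.

```python
import cv2

def estimate_eye_centers(frame):
    """Hypothetical placeholder: run your gaze/landmark model here and
    return the 3D eye-center coordinates it produces for this frame."""
    raise NotImplementedError

def process_video(path):
    cap = cv2.VideoCapture(path)
    results = []
    while True:
        ok, frame = cap.read()
        if not ok:  # end of video
            break
        results.append(estimate_eye_centers(frame))
    cap.release()
    return results

# eye_centers = process_video('input.mp4')
```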

Hi, I am trying to run the demo, but for some reason it just throws an error. Can anybody help me with this? File "F:\Users\Daniel\Downloads\pytorch_mpiigaze_demo-master\pytorch_mpiigaze_demo-master\ptgaze\demo.py", line 11, in from...

Hello, is it possible to save the eye gaze and head position data in a CSV file?
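For anyone wanting to do this without modifying the package, one option is to log the values yourself. The sketch below assumes you already have per-frame gaze angles and a head position from your pipeline; the numbers shown are placeholder examples, not real output.

```python
import csv

# Hypothetical per-frame records: (frame index, gaze pitch, gaze yaw,
# head position x/y/z). Replace with the values your pipeline produces.
records = [
    (0, -0.05, 0.12, 0.01, -0.02, 0.45),
    (1, -0.04, 0.10, 0.01, -0.02, 0.45),
]

with open('gaze_log.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    writer.writerow(['frame', 'pitch', 'yaw', 'head_x', 'head_y', 'head_z'])
    writer.writerows(records)
```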

I understand that the matrix in sample_params.yaml is the camera's intrinsic parameter matrix, but what does the matrix in eth-xgaze.yaml do? ![image](https://user-images.githubusercontent.com/26705398/236776104-73c008ca-4662-495c-9c22-3e5140299423.png)
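For context on what such a matrix encodes: a camera intrinsic matrix describes the pinhole projection from 3D camera coordinates to pixels. The snippet below shows the generic form with arbitrary example values; the exact YAML schema used by this repository may differ.

```python
import numpy as np

# Generic pinhole intrinsic matrix: fx, fy are focal lengths in pixels,
# (cx, cy) is the principal point. The values here are arbitrary examples.
fx, fy = 800.0, 800.0
cx, cy = 320.0, 240.0
camera_matrix = np.array([
    [fx, 0.0, cx],
    [0.0, fy, cy],
    [0.0, 0.0, 1.0],
])

# Projecting a 3D point (in camera coordinates) to pixel coordinates:
point_3d = np.array([0.1, -0.05, 0.5])
u, v, w = camera_matrix @ point_3d
print(u / w, v / w)
```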

It looks like in `generate_dummy_camera_params`, when generating the camera matrix, the focal length is assumed to be the `width`: https://github.com/hysts/pytorch_mpiigaze_demo/blob/47cdf68414d20c8281bbb0a03112a298761aaa9b/ptgaze/utils.py#L125 and I am curious where that logic came from. Ideally...
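For readers hitting the same question: the heuristic described above (focal length set to the image width, principal point at the image center) is a common fallback when no calibration is available. Below is a sketch of that approximation; it is meant to illustrate the idea, not to reproduce the repository's code verbatim.

```python
import numpy as np

def dummy_camera_matrix(width, height):
    """Rough intrinsics for an uncalibrated camera: assume the focal length
    equals the image width and the principal point is the image center."""
    return np.array([
        [width, 0.0,   width / 2],
        [0.0,   width, height / 2],
        [0.0,   0.0,   1.0],
    ])

print(dummy_camera_matrix(640, 480))
```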

Hi, first of all: thank you for your fantastic work! I'd like to integrate gaze estimation into my project, but it's a tough job since I have to build an...

Hello Hysts. First of all, great work. I am using your ResNet-based architecture with MPIIFaceGaze. Would you have a simple diagram of the architecture? Thanks.
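Not the author's diagram, but a rough schematic of how a ResNet-based full-face gaze regressor is commonly wired up: a ResNet backbone followed by a small regression head that outputs two gaze angles (pitch, yaw). The exact architecture used in this repository may differ.

```python
import torch
import torch.nn as nn
import torchvision

class FaceGazeNet(nn.Module):
    """Schematic ResNet-based gaze regressor: face image in, (pitch, yaw) out."""

    def __init__(self):
        super().__init__()
        backbone = torchvision.models.resnet18(weights=None)
        backbone.fc = nn.Identity()      # keep the 512-d feature vector
        self.backbone = backbone
        self.head = nn.Linear(512, 2)    # regress pitch and yaw in radians

    def forward(self, face_image):
        features = self.backbone(face_image)
        return self.head(features)

# Example: a batch of two 224x224 face crops.
model = FaceGazeNet()
angles = model(torch.randn(2, 3, 224, 224))
print(angles.shape)  # torch.Size([2, 2])
```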