
Non-standard hand support

Open 3digitdev opened this issue 2 years ago • 2 comments

Hey there! Not sure if this is an "Issue" but seems like the best way to communicate with the project for now.

I have....well let's go with "non-standard" hands 😂

I figure hand-tracking systems like this are obviously trained on very standard hands, as one would expect. No judgement from me! Best way to start.

I would just like to mention that I'm intensely interested in this project, for its possibility of helping support VR/AR hand tracking in the future.

If in the future this project wants to support more non-standard hands to auto-detect different finger layouts etc, I am a very good test subject -- I have asymmetrical hands with varying sizes and non-standard layouts across the different hands, including a paralyzed finger! If you are training against data sets, I would be happy to (with a set of instructions), provide both training data using my own hands, and code help (as a fellow dev) for this project in the future!

Please don't take this as insistence that this be done early. I realize that this project needs to get off the ground with standard training data first. But in the future....well, keep me in mind!

3digitdev · Oct 12 '21

Hey,

thank you for your feedback and for creating this issue. I hope this can be supported in the future. I believe there are three obstacles, ordered by my subjectively perceived relevance:

  1. It is way more difficult to collect training data for these cases as abnormalities are by definition rare.
  2. Because the audience for which this is relevant is smaller, the implementation is in some sense very expensive. So much so that I think the most realistic path is to first gather more resources: either a sufficient community builds around the project, or a company steps in to support the development of this feature.
  3. The hand tracking becomes more complicated, so the pressure on the neural network increases.

In general, if somebody reading this wants to contribute training data in this direction (it's a super simple process for you), please leave a comment on this issue (or, if you have no GH account, e-mail [email protected]).

I believe that there might also be some interesting research questions in this direction. E.g. can you use some kind of data augmentation to turn training data of whole hands into (approximate) training data for incomplete hands?
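To make the research question above concrete, here is a minimal sketch of one such augmentation: collapsing one finger's keypoints onto its base joint to approximate annotations for a hand where that finger is absent. This assumes a MediaPipe-style 21-point hand layout (wrist plus four joints per finger); the layout, the `drop_finger` helper, and the collapse-with-jitter strategy are all illustrative assumptions, not yoha's actual training pipeline.

```python
import numpy as np

# Indices of each finger's joints (MCP, PIP, DIP, TIP) in the assumed
# 21-point layout (index 0 is the wrist).
FINGER_JOINTS = {
    "thumb":  [1, 2, 3, 4],
    "index":  [5, 6, 7, 8],
    "middle": [9, 10, 11, 12],
    "ring":   [13, 14, 15, 16],
    "pinky":  [17, 18, 19, 20],
}

def drop_finger(keypoints: np.ndarray, finger: str, rng=None) -> np.ndarray:
    """Collapse one finger's joints onto its base (MCP) joint, plus a
    little jitter, approximating a hand where that finger is missing."""
    rng = rng or np.random.default_rng()
    out = keypoints.copy()
    joints = FINGER_JOINTS[finger]
    base = keypoints[joints[0]]
    # Move every joint above the MCP onto the MCP position.
    for j in joints[1:]:
        out[j] = base + rng.normal(scale=0.005, size=base.shape)
    return out

# Dummy normalized (x, y) keypoints standing in for a real annotation.
hand = np.random.default_rng(0).random((21, 2))
augmented = drop_finger(hand, "pinky")
```

Whether such synthetic labels transfer to real non-standard hands is exactly the open question; the jitter scale and the choice to collapse (rather than shorten or reroute) fingers would need validation against real data.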

Thoughts anybody?

b-3-n · Oct 12 '21

100% agree with everything in your response, and I appreciate it. I didn't mean to suggest this should be easy or done soon, but I'm happy to help out. Feel free to send me the process for contributing training data, and I will happily contribute!

Side note: I tried the demo out, and one behavior I noticed is that where the model expected fingers that are missing on my hand, it would place them fully extended but basically touching each other. This created a "second pinch point" that made the actual pinch gesture less accurate, since the tracking would jump between that point and the normal pinch point. Not sure if that helps, but it's an observation!

3digitdev · Oct 13 '21