Details on Gesture Recognition Approach #82

Closed
pablovela5620 opened this issue Sep 3, 2019 · 15 comments

@pablovela5620

The blog gives some details about the algorithm to detect different gestures:

> We apply a simple algorithm to derive the gestures. First, the state of each finger, e.g. bent or straight, is determined by the accumulated angles of joints. Then we map the set of finger states to a set of pre-defined gestures.

A few questions regarding implementation details:

  • Are the accumulated angles based on the 2D pixel values returned from the landmark detection or the 3D values?
  • Are these angles based on an upright hand? I.e., do you consider them in the same way you generate the bounding box (where the wrist joint and the MCP joints are aligned)?
  • Are these angles measured from one joint to the next within the finger, from the wrist joint to each finger joint, or directly from the wrist to the fingertip?
  • Is this all implemented in a new calculator that has not been released? What is the timeline for the release of the gesture tracking?

Thank you for all the help!

@fanzhanggoogle

fanzhanggoogle commented Sep 6, 2019


Hi pablovela5620, here are prompt answers to your questions:

  1. From our experience, 2D landmarks are already enough to produce fairly good results for simple gestures. Using 3D points gives slightly more robust results.
  2. The angles of fingers are independent of the palm pose.
  3. The angles of PIP and DIP are computed from the joint and two neighboring joints, e.g. <MCP, PIP, DIP> for PIP. The angle of MCP is computed from <wrist, MCP, PIP>.
  4. Yes, it's just another calculator that takes in the landmarks. Currently, we do not have a plan to release the calculator, since it's easy to implement and we don't have a clear target set of gestures in mind.
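
Putting these answers together, here is a minimal sketch of the idea in Python. It is only an illustration under assumptions not stated in the thread: landmarks are assumed to follow MediaPipe's 21-point hand layout (wrist at index 0) as (x, y) pairs, and the 460-degree threshold and the gesture table are placeholder values, not the ones used internally.

```python
import math

# Landmark indices per finger in MediaPipe's 21-point hand layout
# (wrist is index 0). Order: MCP, PIP, DIP, TIP. The thumb is treated
# uniformly here for simplicity, even though its joints differ anatomically.
FINGERS = {
    "thumb": (1, 2, 3, 4),
    "index": (5, 6, 7, 8),
    "middle": (9, 10, 11, 12),
    "ring": (13, 14, 15, 16),
    "pinky": (17, 18, 19, 20),
}

def joint_angle(a, b, c):
    """Angle in degrees at point b, formed by the segments b->a and b->c."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    n1, n2 = math.hypot(*v1), math.hypot(*v2)
    if n1 == 0.0 or n2 == 0.0:
        return 180.0
    cos = (v1[0] * v2[0] + v1[1] * v2[1]) / (n1 * n2)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos))))

def finger_state(landmarks, name, straight_threshold=460.0):
    """Classify one finger as 'straight' or 'bent' from accumulated joint angles.

    As described above: the MCP angle uses <wrist, MCP, PIP>, PIP uses
    <MCP, PIP, DIP>, and DIP uses <PIP, DIP, TIP>. A fully straight finger
    accumulates close to 3 * 180 = 540 degrees.
    """
    wrist = landmarks[0]
    mcp, pip, dip, tip = (landmarks[i] for i in FINGERS[name])
    total = (joint_angle(wrist, mcp, pip)
             + joint_angle(mcp, pip, dip)
             + joint_angle(pip, dip, tip))
    return "straight" if total > straight_threshold else "bent"

# Illustrative mapping from the tuple of finger states
# (thumb, index, middle, ring, pinky) to a gesture label.
GESTURES = {
    ("bent", "straight", "bent", "bent", "bent"): "ONE",
    ("bent", "straight", "straight", "bent", "bent"): "YEAH",
    ("straight",) * 5: "FIVE",
    ("bent",) * 5: "FIST",
}

def recognize(landmarks):
    """landmarks: list of 21 (x, y) pairs, e.g. normalized image coordinates."""
    states = tuple(finger_state(landmarks, f) for f in FINGERS)
    return GESTURES.get(states, "UNKNOWN")
```

Using 3D landmark coordinates instead of (x, y) pairs only changes `joint_angle` to a 3-component dot product and norm; the rest of the sketch stays the same, which matches answer 1 above.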

@fanzhanggoogle fanzhanggoogle added the legacy:hands Hand tracking/gestures/etc label Sep 6, 2019
@jiuqiant jiuqiant added the type:support General questions label Sep 6, 2019
@jiuqiant jiuqiant closed this as completed Sep 7, 2019
@pablovela5620
Author

Thank you, FanZhang. I appreciate the response

@dyaeger12345

A tutorial/instructions for implementing gesture recognition would be very helpful. It may seem simple to those who developed the project, but for someone coming in from the outside, it seems strange to have advertised it but to not provide at least some detailed instructions. Thanks!

@lisbravo

> A tutorial/instructions for implementing gesture recognition would be very helpful. It may seem simple to those who developed the project, but for someone coming in from the outside, it seems strange to have advertised it but to not provide at least some detailed instructions. Thanks!

+1

@Momohanfeng

Yes, a tutorial would be helpful for those of us coming from outside, or detailed instructions that would let us write our own gesture recognizer solution.

@gabrielstuff

Hello @pablovela5620,
As your question is very similar to #40, I was wondering what you ended up with, and what you think about the proposed solution explained in the #40 reply.

Thanks!

@Niko-La

Niko-La commented Nov 27, 2019

+1 Tutorial on Gesture Recognizer

@lisbravo

> +1 Tutorial on Gesture Recognizer

I'm working on something very similar to a gesture recognition tutorial and will release it in 7-10 days ;)

@ChuyiZhong

> +1 Tutorial on Gesture Recognizer
>
> I'm working on something very similar to a gesture recognition tutorial and will release it in 7-10 days ;)

Could you share it with the community? I think lots of people are having difficulties on this issue and we will all appreciate that. Thanks in advance.

@lisbravo

lisbravo commented Dec 5, 2019

> +1 Tutorial on Gesture Recognizer
>
> I'm working on something very similar to a gesture recognition tutorial and will release it in 7-10 days ;)
>
> Could you share it with the community? I think lots of people are having difficulties on this issue and we will all appreciate that. Thanks in advance.

Sure, please wait a little bit while I'm finishing it.

@KhuongAnNguyen

A tutorial would be really helpful :) Thanks

@Jaguaribe21

A tutorial, please! Thanks

@TheJLifeX

TheJLifeX commented Jan 11, 2020

I have written this:

Tutorial: Simple Hand Gesture Recognition.

You can recognize ONE, TWO, THREE, FOUR, FIVE, SIX, YEAH, ROCK, SPIDERMAN, and OK, like in this blog post: On-Device, Real-Time Hand Tracking with MediaPipe.
Thank you for this framework.

[hand-gesture demo animation]

@lisbravo

lisbravo commented Apr 9, 2020

@Jaguaribe21 @KhuongAnNguyen @ChuyiZhong @Niko-La @Momohanfeng @dyaeger12345 @pablovela5620
Hey guys, sorry it took me so long to reply. A few days ago I launched an entire open-source framework for gesture recognition and device control, with complete process documentation that you can also use as a tutorial in case you want to implement your own. Check:

#594

Project site:
https://www.deuxexsilicon.com/handcommander/
and the code is at:
https://github.com/lisbravo/myMediaPipe

@laisuki1109

Hey guys, I modified the TensorFlow demo app so it can run the palm model and the hand landmark model.
Please go to https://github.com/laisuki1109/handtracking-with-Mediapipe to explore more!
Thank you!
