fingerpose-gestures is a library built on top of the Fingerpose classifier for hand landmarks detected by TensorFlow.js' handpose model. It can detect hand gestures in a webcam video stream.
Gesture detection works in three steps:

1. Detect the hand landmarks inside the video picture
2. Estimate the direction and curl of each individual finger
3. Compare the result to a set of gesture descriptions

Step (1) is performed by TensorFlow's "handpose" model; steps (2) and (3) are handled by the Fingerpose library.
Install the module via NPM:

```shell
npm i --save fingerpose-gestures
```
A fully working example can be found inside the demo folder. The basic steps are outlined below:
```javascript
import * as tf from "@tensorflow/tfjs";
import * as handpose from "@tensorflow-models/handpose";
import "@tensorflow/tfjs-backend-webgl";
import * as fp from "fingerpose";
import * as fpg from "fingerpose-gestures";

const GE = new fp.GestureEstimator([

  // add "✌🏻" and "👍" as sample gestures from fingerpose
  fp.Gestures.VictoryGesture,
  fp.Gestures.ThumbsUpGesture,

  // add further gestures from fingerpose-gestures
  fpg.Gestures.thumbsDownGesture,
  fpg.Gestures.fingerSplayedGesture,
  fpg.Gestures.raisedHandGesture

  // ... and more
]);

// load the handpose model and detect hands in the video element
const model = await handpose.load();
const predictions = await model.estimateHands(video, true);

// estimateHands() returns an array of predictions, one per detected hand
if (predictions.length > 0) {

  // using a minimum confidence of 7.5 (out of 10)
  const estimatedGestures = GE.estimate(predictions[0].landmarks, 7.5);
}
```
The result is an object containing possible gestures and their confidence, for example:
```json
{
  "poseData": [ ... ],
  "gestures": [
    { "name": "thumbs_up", "confidence": 9.25 },
    { ... }
  ]
}
```
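In most applications you will want to act on the single most likely gesture. A minimal sketch of picking the highest-confidence entry from the estimation result (the sample result object and the `bestGesture` helper are illustrative, not part of the library):

```javascript
// Illustrative estimation result, shaped like the object returned by GE.estimate()
const result = {
  poseData: [],
  gestures: [
    { name: "thumbs_up", confidence: 9.25 },
    { name: "raised_hand", confidence: 8.1 },
  ],
};

// Return the gesture with the highest confidence, or null if none matched
function bestGesture(estimation) {
  if (estimation.gestures.length === 0) return null;
  return estimation.gestures.reduce((best, g) =>
    g.confidence > best.confidence ? g : best
  );
}

console.log(bestGesture(result).name); // "thumbs_up"
```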
| name | emoji | path |
| --- | --- | --- |
| thumbs_up | 👍 | fpg.Gestures.thumbsUpGesture |
| victory | ✌ | fpg.Gestures.victoryGesture |
| thumbs_down | 👎 | fpg.Gestures.thumbsDownGesture |
| finger_splayed | 🖐 | fpg.Gestures.fingerSplayedGesture |
| raised_hand | ✋ | fpg.Gestures.raisedHandGesture |
| pinching | 🤏 | fpg.Gestures.pinchingGesture |
| ok | 👌 | fpg.Gestures.okGesture |
| fist | ✊ | fpg.Gestures.fistGesture |
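To display a matching emoji for a detected gesture name, a plain lookup built from the table above is enough (the `gestureEmoji` helper and its "❓" fallback are my own, not part of the library):

```javascript
// Map of gesture names, as reported in the estimation result, to emojis
const GESTURE_EMOJI = {
  thumbs_up: "👍",
  victory: "✌",
  thumbs_down: "👎",
  finger_splayed: "🖐",
  raised_hand: "✋",
  pinching: "🤏",
  ok: "👌",
  fist: "✊",
};

// Return the emoji for a gesture name, or a fallback for unknown names
function gestureEmoji(name) {
  return GESTURE_EMOJI[name] ?? "❓";
}

console.log(gestureEmoji("thumbs_up")); // 👍
```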