Browser-Based Hand Position Tracking

2020

This project is simple and only took me a few hours, but it is still very cool. It tracks a single hand in the video stream from your webcam, recognizes a few simple gestures (touching your thumb to one of the other fingers), and outputs a message accordingly. The cool part is that it does all of this using only the resources available to your browser, locally, and never sends any data outside of your computer. It uses models from the MediaPipe and TensorFlow JavaScript projects for the tracking data, and the GSAP library combined with some CSS for relatively smooth markers and effects. You can test it right below: just press 'Start!' and allow access to your camera (don't worry, the video never leaves your computer).
 



Start!
Try touching your thumb with one of the other fingers.
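To give a rough idea of how the pieces fit together, here is a minimal sketch of the core loop in TypeScript. This is not the project's actual source: the package imports (@tensorflow/tfjs, @tensorflow-models/handpose, gsap) follow the current npm releases, and the element IDs and the touch threshold are assumptions made up for illustration. The idea is simply to load the handpose model, estimate hand landmarks on each video frame, flag a gesture when the thumb tip gets close enough to another fingertip, and let GSAP ease a marker toward the tracked point.

```typescript
// A minimal sketch, NOT the project's actual source. Package names follow the
// current npm releases (@tensorflow/tfjs, @tensorflow-models/handpose, gsap);
// the element IDs and the touch threshold below are made up for illustration.
import "@tensorflow/tfjs";
import * as handpose from "@tensorflow-models/handpose";
import { gsap } from "gsap";

type Point3D = [number, number, number];

// Plain Euclidean distance between two landmarks.
const dist = (a: Point3D, b: Point3D) =>
  Math.hypot(a[0] - b[0], a[1] - b[1], a[2] - b[2]);

async function run(): Promise<void> {
  // Hypothetical elements: a <video> showing the webcam and a <div> used as a marker.
  const video = document.querySelector<HTMLVideoElement>("#webcam")!;
  const marker = document.querySelector<HTMLDivElement>("#thumb-marker")!;

  // The stream is only attached to the local <video>; nothing is uploaded.
  video.srcObject = await navigator.mediaDevices.getUserMedia({ video: true });
  await video.play();

  // Model weights are downloaded once, then all inference runs in the browser.
  const model = await handpose.load();

  const TOUCH_THRESHOLD = 30; // pixels; an assumed, eyeballed value
  const fingers = ["indexFinger", "middleFinger", "ringFinger", "pinky"] as const;

  async function frame(): Promise<void> {
    const predictions = await model.estimateHands(video);
    if (predictions.length > 0) {
      const { annotations } = predictions[0]; // 21 landmarks grouped per finger
      const thumbTip = annotations.thumb[3] as Point3D;

      // "Gesture" here = the thumb tip getting close enough to another fingertip.
      for (const finger of fingers) {
        const tip = annotations[finger][3] as Point3D;
        if (dist(thumbTip, tip) < TOUCH_THRESHOLD) {
          console.log(`Thumb is touching the ${finger}`);
        }
      }

      // GSAP eases the marker toward the thumb tip for a smoother-looking overlay.
      gsap.to(marker, { x: thumbTip[0], y: thumbTip[1], duration: 0.1 });
    }
    requestAnimationFrame(frame);
  }
  requestAnimationFrame(frame);
}

run();
```

In the real page the detection result drives the on-screen message and the markers cover more than one fingertip, but the thresholded thumb-to-fingertip distance is essentially the whole trick.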
Lang/Lib/Pro          Version
MediaPipe             Handpose Model
TensorFlow            TFLite model, TF.js model
GSAP                  3.0

Type                  Web Script
Input                 Webcam & Hand Gestures
Output                Hand Position & Gesture Interpretation
Special Components    Webcam
Screenshots
Finger tracking.
Gesture interpreting.