Gesture Recognition

Video



Abstract

The goal of this project is to have the telepresence robot recognize and follow military hand signals. These signals consist of commands such as Halt, Crawl Forward, Run Forward, Retreat, Flank Left/Right, etc. Upon recognizing a gesture, the robot executes a hardcoded response. The robot is equipped with a Kinect sensor that tracks the skeleton of a human. We have also coded a baseline classifier that treats each gesture as a stationary pose and uses a geometric KNN algorithm to predict the gesture. The baseline does very well at classifying gestures that are truly stationary, but only passably well at classifying gestures that involve motion. Our actual system exploits the fact that all gestures are roughly sinusoidal: it extracts relevant features such as frequency and amplitude, which are then fed to an SVM to classify the gesture. We consistently classify gestures with over 96% accuracy in randomized offline testing, and the robot can recognize all 30 gestures on both arms with our live algorithm.
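The paper has the full details of the feature pipeline; as a rough illustration of the frequency/amplitude idea only, the sketch below extracts the dominant frequency and amplitude of a single joint-coordinate trajectory with an FFT. The function name, the 30 fps sampling rate, and the use of NumPy are assumptions for the sketch, not the project's actual code; the real system feeds such features into an SVM.

```python
import numpy as np

def sinusoid_features(trajectory, fps=30.0):
    """Estimate (dominant frequency in Hz, amplitude) of a 1-D joint
    trajectory, assuming the motion is roughly sinusoidal.
    `trajectory` is one skeleton-joint coordinate sampled over time."""
    x = np.asarray(trajectory, dtype=float)
    x = x - x.mean()                        # drop the static (DC) offset
    spectrum = np.abs(np.fft.rfft(x))       # magnitude spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    k = spectrum[1:].argmax() + 1           # strongest non-DC bin
    dominant_freq = freqs[k]
    amplitude = 2.0 * spectrum[k] / len(x)  # rescale FFT magnitude to signal units
    return dominant_freq, amplitude

# Example: a synthetic 2 Hz arm wave of amplitude 0.4 m, 3 s at 30 fps.
t = np.arange(0, 3, 1 / 30.0)
f, a = sinusoid_features(0.4 * np.sin(2 * np.pi * 2.0 * t), fps=30.0)
# f is close to 2.0 Hz and a close to 0.4
```

A stationary pose would show near-zero amplitude at every frequency, which is one way such features can also separate the static gestures from the dynamic ones.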

Data

Data files and description available here.

Paper

Paper

Related project: Human Activity Detection.


Offline Testing

Static Gesture Accuracies

Gesture        Accuracy
Abreast        1.00
Enemy          1.00
Freeze         1.00
Hide           1.00
Injury         1.00
Pistol         1.00
Rifle          1.00
Stop           1.00
Unknown        1.00
Antigesture    0.99
Backup         0.95
Land           0.93
Gas            0.90
Watch          0.85
Listen         0.74

Dynamic Gesture Accuracies

Gesture           Accuracy
Action            1.00
Advance           1.00
Attention         1.00
Charge            1.00
Cover             1.00
Crouch            1.00
Rally             1.00
Shift Fire        1.00
Point of Entry    0.99
Confused          0.98
Hurry             0.98
Sneak             0.98
Out of Action     0.96
Come              0.91