The goal of this project is to have the telepresence robot recognize and follow military hand signals such as Halt, Crawl Forward, Run Forward, Retreat, and Flank Left/Right. Upon recognizing a gesture, the robot executes a hardcoded response. The robot is equipped with a Kinect sensor that tracks the skeleton of a human.

As a baseline, we coded a classifier that treats each gesture as a stationary pose and uses a geometric k-nearest-neighbors (KNN) algorithm to predict the gesture. The baseline performs very well on gestures that are genuinely stationary and passably well on gestures that actually involve motion. Our full system exploits the fact that all the gestures are approximately sinusoidal: it extracts relevant features such as frequency and amplitude from the joint trajectories and feeds them to an SVM to classify the gesture. We are able to consistently classify gestures with over 96% accuracy in randomized offline testing, and the robot can recognize all 30 gestures on both arms with our live algorithm.
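The two ideas above — a dominant frequency/amplitude estimate from a roughly sinusoidal joint trajectory, and a geometric nearest-neighbor vote over poses — can be sketched in a few lines of numpy. This is an illustrative sketch, not the project's actual code: the function names, the 30 Hz sampling rate (typical of the Kinect skeleton stream), and the synthetic inputs are all assumptions.

```python
import numpy as np

def wave_features(y, fs=30.0):
    """Estimate dominant frequency (Hz) and amplitude of a joint trajectory.

    y  : 1-D array of a joint coordinate over time (e.g. wrist height)
    fs : sampling rate in Hz (the Kinect skeleton stream runs at ~30 Hz)
    """
    y = np.asarray(y, dtype=float)
    y = y - y.mean()                        # remove the limb's rest offset
    spectrum = np.abs(np.fft.rfft(y))
    freqs = np.fft.rfftfreq(len(y), d=1.0 / fs)
    k = np.argmax(spectrum[1:]) + 1         # strongest bin, skipping DC
    amplitude = 2.0 * spectrum[k] / len(y)  # rescale FFT magnitude to signal units
    return freqs[k], amplitude

def knn_predict(train_X, train_y, x, k=3):
    """Geometric KNN baseline: majority label of the k closest training poses."""
    d = np.linalg.norm(train_X - x, axis=1)
    nearest = train_y[np.argsort(d)[:k]]
    labels, counts = np.unique(nearest, return_counts=True)
    return labels[np.argmax(counts)]
```

For a periodic gesture, `wave_features` applied to each tracked joint yields a small feature vector (frequency, amplitude per joint) that can be handed to an off-the-shelf SVM; the same joint vectors, treated as static poses, feed `knn_predict` for the baseline.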
Related project: Human Activity Detection.
Static Gesture Accuracies

Dynamic Gesture Accuracies

| Gesture | Accuracy |
| --- | --- |
| Point of Entry | 0.99 |
| Out of Action | 0.96 |