
Human Activity Detection from RGBD Images

[Figure: sample images of human activities detected from RGBD data]

Being able to detect and recognize human activities is essential for several applications, including smart homes and personal assistive robotics. In this work, we perform detection and recognition of unstructured human activity in unstructured environments. We use an RGBD sensor (Microsoft Kinect) as input, and compute a set of features based on human pose and motion, as well as on image and point-cloud information.
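As a rough illustration (not the exact feature pipeline from the papers), the sketch below shows the kind of pose and motion features one can compute from Kinect skeleton tracking: body-centric joint positions per frame and frame-to-frame joint displacements, summarized over a clip. The joint layout and array shapes are hypothetical.

```python
import numpy as np

def pose_motion_features(joints):
    """joints: (T, J, 3) array of 3D joint positions (meters) over T frames.

    A minimal sketch of pose/motion features; the torso-joint index and the
    statistics used to summarize a clip are assumptions, not the paper's method.
    """
    torso = joints[:, 0:1, :]                    # assume joint 0 is the torso
    rel = joints - torso                         # body-centric joint positions
    pose = rel.reshape(len(joints), -1)          # per-frame pose vector
    motion = np.diff(joints, axis=0)             # frame-to-frame joint displacement
    motion = motion.reshape(len(motion), -1)
    # Summarize the clip with simple statistics over time.
    return np.concatenate([pose.mean(0), pose.std(0),
                           motion.mean(0), motion.std(0)])

# Example: a random stand-in for a 90-frame clip with 15 tracked joints.
clip = np.random.randn(90, 15, 3)
print(pose_motion_features(clip).shape)
```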





Popular Press

E&T Magazine, Phys.org, R&D Magazine, Gizmag, GizmoWatch, myScience, WonderHowTo, Geekosystem.


Data/Code

Download Cornell Activity Datasets and Code


Results

Check out the latest results on the Cornell Activity Dataset (CAD-60).


Publications

Learning Human Activities and Object Affordances from RGB-D Videos, Hema S Koppula, Rudhir Gupta, Ashutosh Saxena. International Journal of Robotics Research (IJRR), in press, Jan 2013. [PDF] [CAD-120 Dataset]

Unstructured Human Activity Detection from RGBD Images, Jaeyong Sung, Colin Ponce, Bart Selman, Ashutosh Saxena. International Conference on Robotics and Automation (ICRA), 2012. [PDF] [CAD-60 Dataset]

Human Activity Detection from RGBD Images, Jaeyong Sung, Colin Ponce, Bart Selman, Ashutosh Saxena. In AAAI workshop on Pattern, Activity and Intent Recognition (PAIR), 2011. [PDF] [CAD-60 Dataset]


People

Jaeyong Sung: jysung at cs.cornell.edu
Hema Koppula: hema at cs.cornell.edu
Prof. Bart Selman: selman at cs.cornell.edu
Prof. Ashutosh Saxena: asaxena at cs.cornell.edu

Videos



Related Projects

Anticipating Human Activities

Hand Gesture Recognition

3D Scene Understanding