Human moving direction prediction based on Kinect depth sensor

Bibliographic Details
Main Author: Wei, ZiChen.
Other Authors: Ma Kai Kuang
Format: Final Year Project
Language: English
Published: 2012
Subjects:
Online Access: http://hdl.handle.net/10356/49683
Institution: Nanyang Technological University
Description
Summary: This project exploits Microsoft's Kinect camera, which provides a depth sensor alongside the raw video stream, to predict the direction of human movement in real time. This report describes the design procedure and gives a detailed explanation of the resulting human moving direction prediction application. Such a capability is needed by many higher-level computer vision applications; for example, the handoff procedure between adjacent cameras in a multi-camera surveillance system needs the predicted moving direction to decide which camera should be activated next. Traditional approaches to analysing and predicting human movement are mostly based on raw video data and mathematical models: a typical procedure involves obtaining 3-D joint data, performing kinematic analysis [1], and so on. This requires a large amount of computation and makes real-time processing difficult. Our project develops an application that predicts the human moving direction using the Microsoft Kinect sensor. The Kinect's depth sensor captures the depth of objects in the scene; by processing the depth data we can determine the human posture and, from it, predict the moving direction. Compared with the traditional method, our approach is more efficient and more robust, and it achieves real-time processing. Based on the prediction, the application can also plot the trajectory of the human movement. In addition, the project exploits the Kinect's motion-sensing capability: users can control the elevation angle of the sensor with different gestures.
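As a rough illustration of the idea in the summary, the sketch below estimates a moving direction from a short history of torso positions on the floor plane, such as might be derived from Kinect depth or skeleton data. The Kinect interface itself is not shown; the class name DirectionPredictor, the sliding-window length, the 5 cm motion threshold, and the synthetic positions in the usage example are hypothetical stand-ins and are not taken from the project itself.

```python
import math
from collections import deque


class DirectionPredictor:
    """Estimate a person's moving direction from recent torso positions.

    Positions are (x, z) coordinates on the floor plane, in metres,
    as might be extracted from Kinect depth/skeleton frames.
    """

    def __init__(self, window: int = 10):
        # Sliding window of the most recent positions.
        self.history = deque(maxlen=window)

    def update(self, x: float, z: float):
        """Add the latest position and return (unit_dx, unit_dz, heading_deg),
        or None while the window is too short or the person is static."""
        self.history.append((x, z))
        if len(self.history) < 2:
            return None
        # Displacement between the oldest and newest samples in the window.
        (x0, z0), (x1, z1) = self.history[0], self.history[-1]
        dx, dz = x1 - x0, z1 - z0
        dist = math.hypot(dx, dz)
        if dist < 0.05:  # below ~5 cm of movement: treat as standing still
            return None
        heading = math.degrees(math.atan2(dx, dz))  # 0 deg = straight ahead (+z)
        return dx / dist, dz / dist, heading


# Hypothetical usage with synthetic positions (a walk towards +x at fixed depth).
predictor = DirectionPredictor(window=5)
for i in range(6):
    result = predictor.update(x=0.1 * i, z=2.0)
    if result:
        ux, uz, heading = result
        print(f"direction=({ux:.2f}, {uz:.2f}), heading={heading:.1f} deg")
```

The same per-frame positions could also be accumulated into a list and plotted to produce the movement trajectory mentioned in the summary.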