Natural gesture based control for quadcopter

Overview

Bibliographic Details
Main Author: Chang, Yi Wei
Other Authors: Suresh Sundaram
Format: Final Year Project
Language: English
Published: 2015
Subjects:
Online Access: http://hdl.handle.net/10356/65606
Item Description
Summary: Human-Computer Interaction has advanced rapidly over the years, and the opportunities it provides are vast. Through a Natural User Interface, natural gestures can be used to control a quadcopter. This means that the user himself becomes the controller and can send commands to the quadcopter simply by changing the posture of his body. Flying a quadcopter is quite different from driving a remote-controlled car, because unlike a car, a quadcopter can travel along three axes. The Microsoft Kinect is therefore used, as it can obtain depth information from the surroundings. The Kinect tracks the user's skeleton data to determine his current posture. Each posture is then classified as a specific command, which is sent to the quadcopter, in this case the Parrot AR.Drone 2.0. Thresholds were set for each posture to provide a reasonable amount of leeway, ensuring that the user does not accidentally issue a command he did not intend. Commands were sent to the quadcopter over Wi-Fi using the User Datagram Protocol (UDP) to reduce bandwidth overhead and latency, enabling the quadcopter to respond to the user's commands with minimal delay.
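
The summary describes two mechanisms: classifying a Kinect skeleton posture against a threshold, and sending the resulting command to the AR.Drone 2.0 over UDP. The Python sketch below illustrates both under stated assumptions: the joint coordinates, the 0.20 m threshold, and the classify/send_command helpers are illustrative inventions, not the project's actual code; the AT*PCMD message format, the drone's default address 192.168.1.1, and UDP port 5556 follow the published AR.Drone 2.0 SDK documentation.

    # Hypothetical sketch: classify a raised-arm posture from Kinect joint
    # heights and send the matching AT command to the AR.Drone 2.0 over UDP.
    # Joint values and the 0.20 m threshold are illustrative, not the
    # thresholds chosen in the project itself.
    import socket
    import struct

    DRONE_IP = "192.168.1.1"   # default AR.Drone 2.0 access-point address
    AT_PORT = 5556             # UDP port for AT commands (AR.Drone 2.0 SDK)

    def float_to_int(f):
        # The AT protocol encodes each float argument as the signed 32-bit
        # integer that shares its IEEE-754 bit pattern.
        return struct.unpack("<i", struct.pack("<f", f))[0]

    def classify(right_hand_y, right_shoulder_y, threshold=0.20):
        # Posture -> command, with a threshold (in metres) so that small,
        # unintentional movements do not issue a command.
        if right_hand_y - right_shoulder_y > threshold:
            return "ascend"
        if right_shoulder_y - right_hand_y > threshold:
            return "descend"
        return "hover"

    def send_command(sock, seq, command):
        # Map the classified posture to an AT*PCMD line:
        # AT*PCMD=<seq>,<flag>,<roll>,<pitch>,<gaz>,<yaw>, where gaz is
        # the vertical speed. Only gaz is varied here.
        gaz = {"ascend": 0.3, "descend": -0.3, "hover": 0.0}[command]
        msg = "AT*PCMD={},1,0,0,{},0\r".format(seq, float_to_int(gaz))
        sock.sendto(msg.encode("ascii"), (DRONE_IP, AT_PORT))

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_command(sock, 1, classify(right_hand_y=1.45, right_shoulder_y=1.20))

Each command here fits in a single small UDP datagram with no connection setup or retransmission, which is the bandwidth and latency argument the summary makes for choosing UDP over a connection-oriented protocol.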