A Robotic platform for human-robot interaction based on human gestures for play therapy
Main Authors:
Format: text
Language: English
Published: Animo Repository, 2014
Online Access: https://animorepository.dlsu.edu.ph/etd_bachelors/11886
Institution: De La Salle University
Summary: Children diagnosed with developmental disorders such as Attention Deficit Disorder (ADD), Attention Deficit Hyperactivity Disorder (ADHD), and Autism Spectrum Disorder (ASD) need specialists and therapists to help hone their basic social skills. The system utilizes an RGB-D sensor and a robot disguised as a plush toy. The robot executes routines while the sensor captures the gesture movements of the child user. Through this research, children with ADD/ADHD or ASD can develop their social skills through play without the fear that human interaction can induce. Observations are also easier to conduct whenever the child's development needs to be tracked and recorded.
The system features six-vector, HMM-based gesture recognition. It tracks one user at a time and recognizes the following seven gestures: Clap, Left and Right Hand Wave, Left and Right Hand Up, and Left and Right Lateral Raise. Using the RGB camera of the RGB-D sensor, the system can record the interaction between the robot and the user as colored video. Depth and IR images can also be viewed for low-light or nighttime monitoring. Interaction is carried out by a robot disguised as a plush toy with human-like movements, programmed with six action routines.
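The abstract names the approach but not its implementation, so the following is only a minimal sketch of per-gesture HMM classification: it assumes hypothetical hmmlearn GaussianHMM models trained on six-dimensional feature vectors (for example, the 3-D positions of both hands relative to the torso), which is not necessarily how the thesis defines its six vectors.

```python
# Minimal sketch (assumed details, not the thesis code): one GaussianHMM per gesture,
# trained on sequences of six-dimensional feature vectors; an unknown sequence is
# labelled with the gesture whose model gives the highest log-likelihood.
import numpy as np
from hmmlearn.hmm import GaussianHMM

GESTURES = [
    "Clap",
    "Left Hand Wave", "Right Hand Wave",
    "Left Hand Up", "Right Hand Up",
    "Left Lateral Raise", "Right Lateral Raise",
]

def train_models(training_data, n_states=5):
    """training_data maps gesture name -> list of (T_i, 6) feature arrays."""
    models = {}
    for gesture, sequences in training_data.items():
        X = np.vstack(sequences)                   # stacked observation frames
        lengths = [len(seq) for seq in sequences]  # per-sequence frame counts
        model = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
        model.fit(X, lengths)
        models[gesture] = model
    return models

def classify(models, sequence):
    """Return the gesture whose HMM scores the (T, 6) sequence highest."""
    return max(models, key=lambda g: models[g].score(sequence))
```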
The performance of the Gesture Recognition submodule was tested on two sets of test subjects: an adult and a five-year-old child. The adult test achieved a 92.86% recognition rate and the child test 94.13%. The movements of the robot were made to appear natural by designing the robot routines around accelerometer data collected from the test subjects. Comparing the robot and human actions yields NRMSE values ranging from 0.04179 to 0.11651. Because these NRMSE values are close to zero, the actions performed by the robot closely resemble the corresponding human actions.
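The abstract does not state how the RMSE was normalized; the sketch below computes NRMSE under the common assumption that the root-mean-square error is divided by the range of the reference (human) accelerometer signal.

```python
# Sketch of a range-normalized RMSE between a human accelerometer trace and the
# corresponding robot trace (assumed normalization; the thesis does not state it).
import numpy as np

def nrmse(human, robot):
    """NRMSE of two equal-length 1-D signals, normalized by the human signal's range."""
    human = np.asarray(human, dtype=float)
    robot = np.asarray(robot, dtype=float)
    rmse = np.sqrt(np.mean((human - robot) ** 2))
    return rmse / (human.max() - human.min())

# A value near zero (such as the reported 0.04179) indicates the robot trace
# follows the human trace closely.
```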