An automated dialogue management system for a museum robotic guide

Bibliographic Details
Main Author: Yu, Eugene GuangQian
Other Authors: Seet Gim Lee, Gerald
Format: Final Year Project
Language: English
Published: 2015
Subjects:
Online Access:http://hdl.handle.net/10356/64920
Institution: Nanyang Technological University
id sg-ntu-dr.10356-64920
record_format dspace
spelling sg-ntu-dr.10356-649202023-03-04T19:34:04Z An automated dialogue management system for a museum robotic guide Yu, Eugene GuangQian Seet Gim Lee, Gerald School of Mechanical and Aerospace Engineering Robotics Research Centre DRNTU::Engineering::Mechanical engineering::Robots Tour guiding is carried out in museums and many other places of interest, but the quality of human tour guides is difficult to keep uniform, and the nature of the job can be repetitive and mundane. A natural next step is therefore to replace human tour guides with robots, both to achieve a uniform quality of guiding and to remove the need for humans to work such jobs. Since 1997, when the first robotic tour guide, Rhino, was placed in a museum exhibition, research on robotic tour guides has shifted away from map planning and localization; current work focuses more on content generation, physical gestures and facial expressions. These changes give users an engaging and interactive experience with the robots, and in future, once artificial intelligence is added, seamless and life-like interaction with a robot is expected. For this project, a museum robotic guide was built on an existing platform, MAVEN. To mimic a human tour guide, the fully autonomous robotic guide gives a basic tour by leading users around a given floor area and educating them on the displayed items; at the end of the tour, it conducts a question-and-answer (QnA) session. To achieve a fully autonomous robotic guide, we focused on an Automated Docking System, an Automated Navigation System and an Automated Dialogue Management System (ADMS). The ADMS uses the VHToolkit from the University of Southern California (USC) Institute for Creative Technologies (ICT). To use the VHToolkit as the ADMS, a QnA database was created, along with scripts for the displayed items and program code that communicates over the Transmission Control Protocol/Internet Protocol (TCP/IP), allowing information to be shared among the three FYP students' subsystems so that the robot operates fully autonomously and in synchronization. To establish an engaging and interactive experience, the robot generates responses matched to hand gestures, facial expressions and a lip-synced voice. Drawing on a survey conducted, the project established its scope and goals: to allow the robot to operate fully autonomously, to deliver scripts at precise locations and to achieve satisfaction among the survey participants. In future, gesture sensing, human detection, and better Automatic Speech Recognition (ASR) and Text-To-Speech (TTS) should be implemented to make the user experience more awe-inspiring. Bachelor of Engineering (Mechanical Engineering) 2015-06-09T06:12:49Z 2015-06-09T06:12:49Z 2015 2015 Final Year Project (FYP) http://hdl.handle.net/10356/64920 en Nanyang Technological University 79 p. application/pdf
institution Nanyang Technological University
building NTU Library
continent Asia
country Singapore
Singapore
content_provider NTU Library
collection DR-NTU
language English
topic DRNTU::Engineering::Mechanical engineering::Robots
spellingShingle DRNTU::Engineering::Mechanical engineering::Robots
Yu, Eugene GuangQian
An automated dialogue management system for a museum robotic guide
description Tour guiding is carried out in museums and many other places of interest, but the quality of human tour guides is difficult to keep uniform, and the nature of the job can be repetitive and mundane. A natural next step is therefore to replace human tour guides with robots, both to achieve a uniform quality of guiding and to remove the need for humans to work such jobs. Since 1997, when the first robotic tour guide, Rhino, was placed in a museum exhibition, research on robotic tour guides has shifted away from map planning and localization; current work focuses more on content generation, physical gestures and facial expressions. These changes give users an engaging and interactive experience with the robots, and in future, once artificial intelligence is added, seamless and life-like interaction with a robot is expected. For this project, a museum robotic guide was built on an existing platform, MAVEN. To mimic a human tour guide, the fully autonomous robotic guide gives a basic tour by leading users around a given floor area and educating them on the displayed items; at the end of the tour, it conducts a question-and-answer (QnA) session. To achieve a fully autonomous robotic guide, we focused on an Automated Docking System, an Automated Navigation System and an Automated Dialogue Management System (ADMS). The ADMS uses the VHToolkit from the University of Southern California (USC) Institute for Creative Technologies (ICT). To use the VHToolkit as the ADMS, a QnA database was created, along with scripts for the displayed items and program code that communicates over the Transmission Control Protocol/Internet Protocol (TCP/IP), allowing information to be shared among the three FYP students' subsystems so that the robot operates fully autonomously and in synchronization. To establish an engaging and interactive experience, the robot generates responses matched to hand gestures, facial expressions and a lip-synced voice. Drawing on a survey conducted, the project established its scope and goals: to allow the robot to operate fully autonomously, to deliver scripts at precise locations and to achieve satisfaction among the survey participants. In future, gesture sensing, human detection, and better Automatic Speech Recognition (ASR) and Text-To-Speech (TTS) should be implemented to make the user experience more awe-inspiring.
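The ADMS described above pairs a QnA database with TCP/IP links so that the three subsystems can share information. As an illustration only, the sketch below shows how such a keyword-matched QnA lookup might be exposed to another subsystem over a TCP socket in Python; the QnA entries, host, port, and plain-text message format are hypothetical, since this record does not specify the actual VHToolkit integration or protocol used in the project.

```python
import socket

# Hypothetical mini QnA store; the real database, exhibit scripts and
# VHToolkit message format are not described in this record.
QNA = {
    "rhino": "Rhino, deployed in 1997, was the first robotic museum tour guide.",
    "maven": "MAVEN is the existing mobile platform this robotic guide is built on.",
}

def answer(question: str) -> str:
    """Return the first QnA entry whose keyword appears in the question."""
    q = question.lower()
    for keyword, reply in QNA.items():
        if keyword in q:
            return reply
    return "I am not sure; please ask about an exhibit on this tour."

def serve(host: str = "127.0.0.1", port: int = 5005) -> None:
    """Answer one question from another subsystem over a TCP/IP connection."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen()
        conn, _addr = srv.accept()
        with conn:
            question = conn.recv(1024).decode()
            conn.sendall(answer(question).encode())
```

In this sketch the dialogue manager is a simple keyword matcher; the actual project delegates question answering to the VHToolkit and uses TCP/IP only to keep the docking, navigation and dialogue subsystems synchronized.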
author2 Seet Gim Lee, Gerald
author_facet Seet Gim Lee, Gerald
Yu, Eugene GuangQian
format Final Year Project
author Yu, Eugene GuangQian
author_sort Yu, Eugene GuangQian
title An automated dialogue management system for a museum robotic guide
title_short An automated dialogue management system for a museum robotic guide
title_full An automated dialogue management system for a museum robotic guide
title_fullStr An automated dialogue management system for a museum robotic guide
title_full_unstemmed An automated dialogue management system for a museum robotic guide
title_sort automated dialogue management system for a museum robotic guide
publishDate 2015
url http://hdl.handle.net/10356/64920
_version_ 1759857545870049280