Detection of human-object interaction in the meeting room
With the improvement of computing power and the rapid development of AI technology, our lives are gradually changing for the better, and increasingly ambitious goals are being achieved. For example, face detection and recognition make the 'Smart Workspace' possible by replacing loophole-prone check-in and check-out procedures with a simple, efficient, and accurate method; object detection and recognition make the 'Smart Classroom' possible by recording and recognizing students' behaviour in real time. Recently, the goal of a 'Smart Meeting Room' has been proposed to help people better record and analyse meetings. Several tasks must be addressed, such as understanding how people communicate with one another, what they do during a meeting, and how they interact with their surroundings. To better understand and record what people are doing and which objects they interact with, this project applies one of the latest research topics in computer vision, Human-Object Interaction (HOI) Detection, to the meeting-room scenario. After a study of the state of the art, the HOI detection method 'iCAN' was chosen as the baseline. iCAN was trained and evaluated on the V-COCO and HICO-DET datasets. A demonstration of iCAN on example pictures of a meeting revealed some common problems. A small evaluation dataset of meeting-room images was then constructed to assess iCAN's performance in this scenario. Using this evaluation dataset as the benchmark, the original method was modified to resolve the problems observed earlier. Finally, some automation work was done to make the whole process quick and convenient.
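The abstract centres on detecting ⟨human, verb, object⟩ triplets. As a minimal illustrative sketch (not iCAN's actual API; the class and function names here are hypothetical), an HOI detection can be represented and filtered by confidence like this:

```python
# Minimal sketch of how HOI detections are commonly represented and
# score-filtered. The dataclass, names, and threshold are illustrative
# assumptions, not part of iCAN's real interface.
from dataclasses import dataclass

@dataclass
class HOIDetection:
    human_box: tuple   # (x1, y1, x2, y2) of the detected person
    object_box: tuple  # (x1, y1, x2, y2) of the interacted object
    verb: str          # interaction class, e.g. "hold", "sit_on"
    score: float       # confidence of the <human, verb, object> triplet

def filter_detections(detections, threshold=0.5):
    """Keep only triplets whose confidence meets the threshold."""
    return [d for d in detections if d.score >= threshold]

detections = [
    HOIDetection((10, 20, 80, 200), (60, 150, 120, 210), "hold", 0.91),
    HOIDetection((10, 20, 80, 200), (0, 180, 300, 260), "sit_on", 0.34),
]
kept = filter_detections(detections)
print([d.verb for d in kept])  # → ['hold']
```

Thresholding like this is typically the last step of an HOI pipeline; the problems the project reports (and fixes against its meeting-room evaluation set) would surface as wrong verbs or miscalibrated scores at this stage.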
Saved in:

Main Author: | Wang, Taige |
---|---|
Other Authors: | Tan Yap Peng |
School: | School of Electrical and Electronic Engineering |
Degree: | Bachelor of Engineering (Electrical and Electronic Engineering) |
Format: | Final Year Project |
Language: | English |
Published: | Nanyang Technological University, 2020 |
Subjects: | Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision; Engineering::Electrical and electronic engineering |
Online Access: | https://hdl.handle.net/10356/139873 |
Institution: | Nanyang Technological University |