The augmented human : seeing sounds
Human ears can hear sounds in the frequency range of 20 Hz to 20 kHz. They can distinguish the ringing of a telephone from the roar of a lion. They are also capable of tracking sounds, giving the listener a rough estimate of the direction; to confirm the source, the person only needs to look...
Main Author: | Nurazhar Maarof |
---|---|
Other Authors: | Cham Tat Jen |
Format: | Final Year Project |
Language: | English |
Published: | 2019 |
Subjects: | DRNTU::Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision |
Online Access: | http://hdl.handle.net/10356/77079 |
Institution: | Nanyang Technological University |
id | sg-ntu-dr.10356-77079 |
---|---|
record_format | dspace |
spelling | sg-ntu-dr.10356-77079 2023-03-03T20:50:18Z The augmented human : seeing sounds Nurazhar Maarof Cham Tat Jen School of Computer Science and Engineering DRNTU::Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision Human ears can hear sounds in the frequency range of 20 Hz to 20 kHz. They can distinguish the ringing of a telephone from the roar of a lion. They are also capable of tracking sounds, giving the listener a rough estimate of the direction; to confirm the source, the person only needs to look in that direction. A hearing impairment, however, deprives a person of this ability. This project explores the possibility of using readily available technology as a substitute for locating the direction of sounds by merging signal and image processing. It uses portable devices: a Raspberry Pi for computation, a camera module to stream images, and a MATRIX Creator with its embedded microphone array to capture audio. After implementing and testing the system with two methods of merging signal and image processing, the method based on Normalized Device Coordinates (NDC) gave the better result, able to highlight any given sound and track its movement. Bachelor of Engineering (Computer Engineering) 2019-05-06T07:31:58Z 2019-05-06T07:31:58Z 2019 Final Year Project (FYP) http://hdl.handle.net/10356/77079 en Nanyang Technological University 45 p. application/pdf |
institution | Nanyang Technological University |
building | NTU Library |
continent | Asia |
country | Singapore |
content_provider | NTU Library |
collection | DR-NTU |
language | English |
topic | DRNTU::Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision |
spellingShingle | DRNTU::Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision Nurazhar Maarof The augmented human : seeing sounds |
description | Human ears can hear sounds in the frequency range of 20 Hz to 20 kHz. They can distinguish the ringing of a telephone from the roar of a lion. They are also capable of tracking sounds, giving the listener a rough estimate of the direction; to confirm the source, the person only needs to look in that direction. A hearing impairment, however, deprives a person of this ability.
This project explores the possibility of using readily available technology as a substitute for locating the direction of sounds by merging signal and image processing. It uses portable devices: a Raspberry Pi for computation, a camera module to stream images, and a MATRIX Creator with its embedded microphone array to capture audio.
After implementing and testing the system with two methods of merging signal and image processing, the method based on Normalized Device Coordinates (NDC) gave the better result, able to highlight any given sound and track its movement (a minimal sketch of such an NDC mapping follows this record). |
author2 | Cham Tat Jen |
author_facet | Cham Tat Jen Nurazhar Maarof |
format | Final Year Project |
author | Nurazhar Maarof |
author_sort | Nurazhar Maarof |
title | The augmented human : seeing sounds |
title_short | The augmented human : seeing sounds |
title_full | The augmented human : seeing sounds |
title_fullStr | The augmented human : seeing sounds |
title_full_unstemmed | The augmented human : seeing sounds |
title_sort | augmented human : seeing sounds |
publishDate | 2019 |
url | http://hdl.handle.net/10356/77079 |
_version_ | 1759857244404449280 |
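The abstract describes mapping the estimated direction of a sound into the camera image using Normalized Device Coordinates (NDC). The sketch below is a minimal illustration of how such a mapping could be written, assuming a pinhole camera model and the field of view of the Raspberry Pi Camera Module v2; the function names, angles, and frame size are hypothetical and are not taken from the thesis, and the MATRIX Creator direction-of-arrival estimation itself is not shown.

```python
import math


def doa_to_ndc(azimuth_deg, elevation_deg, h_fov_deg=62.2, v_fov_deg=48.8):
    """Map a direction of arrival (degrees; 0 = camera centre, right/up positive)
    to normalized device coordinates in [-1, 1] using a simple pinhole model.
    The default field-of-view values are those of the Raspberry Pi Camera v2,
    an assumption rather than a figure from the thesis."""
    x = math.tan(math.radians(azimuth_deg)) / math.tan(math.radians(h_fov_deg / 2))
    y = math.tan(math.radians(elevation_deg)) / math.tan(math.radians(v_fov_deg / 2))
    # Clamp so a source just outside the frame is pinned to the nearest edge.
    x = max(-1.0, min(1.0, x))
    y = max(-1.0, min(1.0, y))
    return x, y


def ndc_to_pixel(ndc, width=640, height=480):
    """Convert NDC to pixel coordinates (origin at the image's top-left corner)."""
    x, y = ndc
    u = int(round((x + 1.0) / 2.0 * (width - 1)))
    v = int(round((1.0 - y) / 2.0 * (height - 1)))  # flip: NDC y is up, image y is down
    return u, v


if __name__ == "__main__":
    # Hypothetical example: a sound arriving 15 degrees to the right, 5 degrees up.
    ndc = doa_to_ndc(15.0, 5.0)
    print("NDC:", ndc, "-> pixel:", ndc_to_pixel(ndc))
```

Clamping to [-1, 1] keeps a sound arriving from just outside the camera's field of view pinned to the nearest frame edge instead of disappearing, which is one simple way an overlay could still indicate roughly where to look.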