Android based application for visually impaired using deep learning approach
Main Authors:
Format: Article
Language: English
Published: Institute of Advanced Engineering and Science, 2021
Online Access:
- http://eprints.utem.edu.my/id/eprint/25760/2/2021_ANDROID%20BASED%20APPLICATION%20FOR%20VISUALLY%20IMPAIRED%20USING%20DEEP%20LEARNING%20APPROACH.PDF
- http://eprints.utem.edu.my/id/eprint/25760/
- https://ijai.iaescore.com/index.php/IJAI/article/view/20741/13267
Institution: Universiti Teknikal Malaysia Melaka
Summary: People with visual impairment have difficulty performing activities related to their environment, social interaction and technology. They also face challenges in remaining independent and safe in their daily routines. This research proposes a deep-learning-based visual object recognition model, delivered through an Android application, to help visually impaired people in their daily lives. The research focuses mainly on recognising money, clothing and other basic objects to make their lives easier. A convolutional neural network (CNN) based visual recognition model is developed with the TensorFlow Object Detection application programming interface (API), using a single-shot detector (SSD) with a pre-trained MobileNet V2 model trained on a Google dataset. The visually impaired person captures an image, which is compared against the preloaded image dataset for recognition. A verbal message stating the name of the recognised object lets the blind user know what was captured. The object recognition achieves high accuracy and can be used without an internet connection. This research is of significant benefit to the visually impaired.
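The abstract describes a pipeline in which SSD detections are turned into a verbal message for the user. A minimal sketch of that post-detection step is shown below; the label map, the 0.5 confidence threshold, and the function name are illustrative assumptions, not details taken from the paper.

```python
# Sketch of the recognition-to-speech step described in the abstract: take the
# raw outputs of an SSD detector (class indices and confidence scores), keep
# only the confident detections, and build the verbal message announced to the
# user. The label map and the 0.5 threshold are assumed for illustration.

LABELS = {0: "10 ringgit note", 1: "50 ringgit note", 2: "shirt", 3: "trousers"}

def detections_to_message(class_ids, scores, threshold=0.5):
    """Return a spoken-style message naming each confident detection."""
    names = [LABELS[c] for c, s in zip(class_ids, scores) if s >= threshold]
    if not names:
        return "No object recognised"
    return "Detected: " + ", ".join(names)

print(detections_to_message([1, 2, 3], [0.91, 0.62, 0.30]))
# → Detected: 50 ringgit note, shirt
```

On Android, the resulting string would typically be passed to a text-to-speech engine (for example the platform `TextToSpeech` API) so the message is read aloud offline.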