Gesture-controlled semi-autonomous robotic arm with AHRS-based orientation tracking and real-time object detection
This report presents the development of a semi-autonomous robotic arm system designed for dynamic environments, integrating gesture control, object detection, and robotic manipulation. The system employed an Adafruit Feather nRF52840 Sense microcontroller equipped with a 9-degrees-of-freedom (9-DoF) inertial measurement unit, running an attitude and heading reference system (AHRS) algorithm to measure orientation for precise arm movement control. The robotic arm's operations were further enhanced by a TensorFlow Lite gesture recognition model, optimised for low-latency inference on the microcontroller, and a YOLOv4-Tiny image recognition model running on a Raspberry Pi 4 for real-time object detection. Communication between components was achieved through Bluetooth Low Energy (BLE), ensuring seamless interaction and efficient data transfer. Multithreading on the Raspberry Pi ensured smooth operation by handling BLE communication, camera input, and object detection concurrently. This distributed architecture allowed the robotic arm to respond to user input while performing object detection in real time. This project demonstrates the potential of combining AHRS-based orientation tracking, gesture recognition, and object detection to enhance human-robot interaction in research, manufacturing, and hazardous environments.
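The abstract does not name the specific AHRS algorithm run on the nRF52840, so the sketch below illustrates the general idea with a basic complementary filter and a tilt-compensated magnetometer heading; the blend factor `ALPHA`, the axis conventions, and the function name are assumptions, not taken from the report. Python is used throughout these sketches for consistency, although firmware on the microcontroller itself would typically be C/C++.

```python
import math

# Illustrative complementary-filter AHRS for a 9-DoF IMU.
# ALPHA and the axis conventions are assumed, not from the report.
ALPHA = 0.98
roll = pitch = yaw = 0.0

def ahrs_update(ax, ay, az, gx, gy, gz, mx, my, mz, dt):
    """Fuse one accelerometer (m/s^2), gyroscope (rad/s), and
    magnetometer sample taken dt seconds after the previous one."""
    global roll, pitch, yaw

    # Accelerometer provides an absolute, but noisy, gravity reference.
    acc_roll = math.atan2(ay, az)
    acc_pitch = math.atan2(-ax, math.hypot(ay, az))

    # Integrate gyro rates, then pull the estimate toward the accel reference.
    roll = ALPHA * (roll + gx * dt) + (1.0 - ALPHA) * acc_roll
    pitch = ALPHA * (pitch + gy * dt) + (1.0 - ALPHA) * acc_pitch

    # Tilt-compensated magnetometer heading; sign conventions depend on
    # how the sensor is mounted.
    xh = mx * math.cos(pitch) + mz * math.sin(pitch)
    yh = (mx * math.sin(roll) * math.sin(pitch)
          + my * math.cos(roll)
          - mz * math.sin(roll) * math.cos(pitch))
    yaw = math.atan2(-yh, xh)
    return roll, pitch, yaw
```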
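The report pairs YOLOv4-Tiny with a Raspberry Pi 4 but the abstract does not state which inference runtime was used; one plausible setup is OpenCV's DNN module, sketched below. The `.cfg`/`.weights` file names and the detection thresholds are assumptions.

```python
import cv2

# Hypothetical YOLOv4-Tiny inference via OpenCV's DNN module; file names
# and thresholds are assumptions, not taken from the report.
net = cv2.dnn.readNetFromDarknet("yolov4-tiny.cfg", "yolov4-tiny.weights")
model = cv2.dnn_DetectionModel(net)
model.setInputParams(size=(416, 416), scale=1 / 255.0, swapRB=True)

cap = cv2.VideoCapture(0)  # Pi camera or USB webcam
ok, frame = cap.read()
if ok:
    class_ids, scores, boxes = model.detect(
        frame, confThreshold=0.4, nmsThreshold=0.4)
    for cid, score, (x, y, w, h) in zip(class_ids, scores, boxes):
        # Draw each detection on the frame.
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
cap.release()
```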
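The concurrent Pi-side pipeline the abstract describes (BLE communication, camera input, and object detection handled in parallel) could be structured as below; the thread layout, queue size, and names are illustrative rather than the report's actual code.

```python
import queue
import threading

import cv2

# Illustrative thread layout: a capture thread feeds frames through a
# small queue to a detection thread, leaving the main thread free for
# BLE I/O. Names and the queue size are assumptions.
frames = queue.Queue(maxsize=2)
stop = threading.Event()

def capture_loop():
    cap = cv2.VideoCapture(0)
    while not stop.is_set():
        ok, frame = cap.read()
        if not ok:
            continue
        try:
            frames.put(frame, timeout=0.1)  # drop frames when detection lags
        except queue.Full:
            pass
    cap.release()

def detect_loop(model):
    while not stop.is_set():
        try:
            frame = frames.get(timeout=0.1)
        except queue.Empty:
            continue
        model.detect(frame, confThreshold=0.4, nmsThreshold=0.4)

threading.Thread(target=capture_loop, daemon=True).start()
# A second thread would run detect_loop(model) with the DetectionModel
# from the previous sketch, while the main thread services BLE
# notifications (e.g. via a library such as bleak).
```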
Main Author: Loh, Aloysius Sijing
Other Authors: Oh Hong Lye (hloh@ntu.edu.sg)
School: College of Computing and Data Science
Format: Final Year Project (FYP)
Degree: Bachelor's degree
Language: English
Published: Nanyang Technological University, 2024
Subjects: Computer and Information Science
Project code: SCSE23-0910
Online Access: https://hdl.handle.net/10356/181500
Institution: Nanyang Technological University
Citation: Loh, A. S. (2024). Gesture-controlled semi-autonomous robotic arm with AHRS-based orientation tracking and real-time object detection. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/181500