Using touch sensors to adapt skewering approach of robot arm for assistive feeding purposes

Bibliographic Details
Main Author: Shrivastava, Samruddhi
Other Authors: Ang Wei Tech
Format: Final Year Project
Language: English
Published: Nanyang Technological University 2023
Online Access: https://hdl.handle.net/10356/168699
Description
Summary: Feeding is an activity of daily living (ADL) that many people struggle to perform independently, and assistive feeding with robotic arms has therefore attracted increasing research attention in recent years. In works such as Sundaresan et al. [1], a robotic arm is used in conjunction with a vision sensor and force-torque sensors to generate fork-skewering strategies for various foods. However, force-torque sensors are expensive and have a lengthy, complicated fabrication process. In this work, the classifier algorithm created by Sundaresan et al. [1], HapticVisualNet, is evaluated using touch sensors instead of force-torque sensors, since touch sensors are significantly cheaper and easier to manufacture. The touch sensors were created by researchers at the Leong Research Group (Soft Electronics Lab) at NTU. First, these touch sensors are integrated into the circuit and robotic system hardware. Their performance is then evaluated, and it is observed that they can distinguish between soft foods, such as bananas, and hard foods, such as apples. A food-skewering touch-sensor dataset is created to train HapticVisualNet. In real-time food-skewering experiments, the touch-sensor-based system achieved accuracy comparable to that obtained with force-torque sensors. Thus, this is a feasible system that is better suited to an assisted-living context. Some limitations of this approach are also discussed, along with suggestions for future improvements.
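
As an illustrative aside, the basic idea of inferring food hardness from touch-sensor signals during skewering might be sketched as below. This is a minimal, assumption-laden sketch rather than the project's HapticVisualNet pipeline: the reading format, normalisation, threshold value, and class labels are all hypothetical.

```python
# Illustrative sketch only: a minimal hardness classifier over touch-sensor
# readings. The sampling format and the 0.6 threshold are hypothetical
# placeholders, not the thesis's actual HapticVisualNet method.
import numpy as np

SOFT = "soft"   # e.g. banana
HARD = "hard"   # e.g. apple

def classify_hardness(readings: np.ndarray, threshold: float = 0.6) -> str:
    """Label one skewering attempt from normalised touch-sensor readings (0..1).

    A higher peak contact signal during skewering is taken to indicate a
    firmer food item; in practice the threshold would be calibrated against
    a labelled food-skewering dataset.
    """
    peak = float(np.max(readings))
    return HARD if peak >= threshold else SOFT

if __name__ == "__main__":
    # Simulated readings standing in for real sensor samples.
    banana_like = np.array([0.05, 0.12, 0.20, 0.18, 0.10])
    apple_like = np.array([0.30, 0.55, 0.82, 0.74, 0.40])
    print(classify_hardness(banana_like))  # soft
    print(classify_hardness(apple_like))   # hard
```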