A voice activated bi-articular exosuit for upper limb assistance during lifting tasks
Main Authors: , , , , ,
Other Authors:
Format: Article
Language: English
Published: 2022
Subjects:
Online Access: https://hdl.handle.net/10356/159638
Institution: Nanyang Technological University
Summary: Humans are favoured over conventional robotics for some industrial tasks because of their superior dexterity and fine motor skills; however, performing these tasks can result in injury to the worker, at a cost to both the worker and the employer. In this paper we describe a lightweight, upper-limb exosuit intended to assist the user during lifting tasks (up to 10 kg) and while operating power tools, both common activities for industrial workers. The exosuit assists elbow and shoulder flexion for both arms and allows passive movement in the transverse plane. To meet the design criteria, an underactuated mechanism has been developed in which a single motor assists two degrees of freedom per arm. In the intended application the hands are generally occupied and cannot be used to provide inputs to the robot; therefore, a voice-activated control has been developed that allows the user to operate the exosuit through spoken commands. Experiments were performed on 5 healthy subjects to assess the change in Muscular Activation (MA), inferred from Electromyography (EMG) signals, during three tasks: i) lifting and releasing a load; ii) holding a position; and iii) manipulating a tool. The results showed that the exosuit reduced EMG activity by between 24.6% and 64.6%, and the voice recognition module achieved a recognition rate of 94.8%.
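The reported EMG reductions can be illustrated with a minimal sketch. Note this is an assumption for illustration only: the function names are hypothetical and the paper does not specify its activation measure; an RMS amplitude comparison between unassisted and assisted trials is used here as a common proxy.

```python
import numpy as np

def rms(emg: np.ndarray) -> float:
    # Root-mean-square amplitude of an EMG segment, a common
    # proxy for muscular activation (assumed measure, not the paper's).
    return float(np.sqrt(np.mean(np.square(emg))))

def activation_reduction(emg_baseline: np.ndarray, emg_assisted: np.ndarray) -> float:
    # Percent reduction in activation with the exosuit engaged,
    # relative to the unassisted baseline trial.
    base = rms(emg_baseline)
    return 100.0 * (base - rms(emg_assisted)) / base

# Synthetic example: assisted signal with roughly half the baseline amplitude.
rng = np.random.default_rng(0)
baseline = rng.normal(0.0, 1.0, 2000)
assisted = rng.normal(0.0, 0.5, 2000)
print(f"{activation_reduction(baseline, assisted):.1f}% reduction")
```

A reduction in this range for a given muscle would correspond to the 24.6%-64.6% interval reported across tasks and subjects.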