Development of gesture-controlled robotic arm for upper limb hemiplegia therapy


Bibliographic Details
Main Author: Wan Azlan, Wan Norliyana
Format: Thesis
Language: English
Published: 2022
Online Access:http://eprints.uthm.edu.my/8366/1/24p%20WAN%20NORLIYANA%20WAN%20AZLAN.pdf
http://eprints.uthm.edu.my/8366/2/WAN%20NORLIYANA%20WAN%20AZLAN%20COPYRIGHT%20DECLARATION.pdf
http://eprints.uthm.edu.my/8366/3/WAN%20NORLIYANA%20WAN%20AZLAN%20WATERMARK.pdf
Description
Summary: Human-computer interaction using hand gesture recognition has emerged as a prominent approach in recent rehabilitation studies. The introduction of vision-based systems such as the Microsoft Kinect and the Leap Motion sensor (LMS) provides a very informative description of hand pose that can be exploited for tracking applications. Compared to the Kinect depth camera, the LMS produces a more limited amount of information and a smaller interaction zone, but its output data are more accurate. Thus, this study aims to explore the LMS as an effective means of controlling a robotic arm through hand gesture recognition to improve upper-extremity motor function therapy. Several engineering challenges are addressed in developing a viable system for this therapy application: real-time and accurate hand movement detection, the limitations of the robot workspace and hand-robot coordination, and the development of a hand motion-based robot positioning algorithm. An EMU HS4 robot arm and its controller have been retrofitted to allow 3 degrees of freedom (DOF) of movement and to be directly controlled by LMS-based gesture recognition. A series of wrist-revolving rehabilitation exercises was conducted, showing good agreement between hand movement and the resulting robot motion. The potential of the proposed system has been further illustrated and verified through comprehensive rehabilitation training exercises, achieving around 90% accuracy for flexion-extension training. In conclusion, these findings have significant implications for the understanding of hand recognition applications in robot-based upper limb assistive and rehabilitation procedures.
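The record gives no implementation details, but the hand motion-based robot positioning described in the abstract can be sketched roughly as a mapping from a tracked palm position to clamped joint commands for a 3-DOF arm. The sketch below is illustrative only: the sensor workspace bounds, joint limits, and the function palm_to_joint_angles are assumptions, not taken from the thesis or the Leap Motion SDK.

```python
# Illustrative sketch (not from the thesis): map a palm position reported by a
# hand-tracking sensor such as the Leap Motion to 3-DOF joint angles by linear
# scaling, clamped to an assumed robot workspace.

def clamp(value, low, high):
    """Keep a value inside the allowed range."""
    return max(low, min(high, value))

def scale(value, src_min, src_max, dst_min, dst_max):
    """Linearly map a value from the sensor range to the joint range."""
    ratio = (value - src_min) / (src_max - src_min)
    return dst_min + ratio * (dst_max - dst_min)

# Assumed sensor interaction zone in millimetres (hypothetical values).
SENSOR_RANGE = {"x": (-150.0, 150.0), "y": (50.0, 350.0), "z": (-150.0, 150.0)}
# Assumed joint limits in degrees for a 3-DOF arm (hypothetical values).
JOINT_RANGE = {"base": (-90.0, 90.0), "shoulder": (0.0, 120.0), "elbow": (-60.0, 60.0)}

def palm_to_joint_angles(palm_x, palm_y, palm_z):
    """Convert a palm position (mm) to clamped joint angles (degrees)."""
    base = scale(clamp(palm_x, *SENSOR_RANGE["x"]), *SENSOR_RANGE["x"], *JOINT_RANGE["base"])
    shoulder = scale(clamp(palm_y, *SENSOR_RANGE["y"]), *SENSOR_RANGE["y"], *JOINT_RANGE["shoulder"])
    elbow = scale(clamp(palm_z, *SENSOR_RANGE["z"]), *SENSOR_RANGE["z"], *JOINT_RANGE["elbow"])
    return base, shoulder, elbow

if __name__ == "__main__":
    # Example: palm roughly centred and raised 200 mm above the sensor.
    print(palm_to_joint_angles(0.0, 200.0, 50.0))  # -> (0.0, 60.0, 20.0)
```

In practice, the per-frame palm position would come from the sensor's tracking API and the resulting angles would be sent to the retrofitted arm's controller; the linear scaling shown here is only one simple choice for the hand-robot coordination step the abstract mentions.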