Sign language recognition using You Only Look Once-Neural Architecture Search / Muhammad Imran Nasharudin

American Sign Language (ASL) is a nonverbal language that uses visual sign patterns formed with the hands or other parts of the body, and it is typically used by people with hearing impairments. Deaf and mute people have difficulty communicating their thoughts, needs, and feelings through spoken language, so an alternative, computer-based method is needed for people who are deaf or hard of hearing, since vocal communication is not available to them. One issue in computer-based sign language recognition is latency, which delays the interpretation of certain gestures. To balance latency against throughput, the network architecture is discovered automatically using a Neural Architecture Search (NAS) technology called AutoNAC. The innovative features of YOLO-NAS include the quantization-aware modules QSP and QCI, which combine re-parameterization with 8-bit quantization to minimize accuracy loss during post-training quantization. The architecture is designed to detect small objects, improve localization accuracy, and raise the performance-per-compute ratio, making it suitable for real-time edge-device applications. Using YOLO-NAS for sign language recognition, we achieved an average detection rate of 86% across all sign language alphabet classes. YOLO-NAS networks are thus successfully applied to sign language recognition, with a reported mAP@50 of 96.41.
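To make the detection pipeline concrete, the following is a minimal sketch of running a YOLO-NAS model with the open-source super-gradients library, assuming a small (yolo_nas_s) variant fine-tuned on the 26 ASL alphabet classes. The checkpoint path and image file names are hypothetical placeholders, not artifacts released with this thesis.

    from super_gradients.training import models

    # Load the small YOLO-NAS variant; num_classes=26 covers the A-Z alphabet.
    # The checkpoint path is a hypothetical fine-tuned weights file.
    model = models.get(
        "yolo_nas_s",
        num_classes=26,
        checkpoint_path="asl_yolo_nas_s.pth",
    )

    # Run detection on one frame; conf filters out low-confidence boxes.
    result = model.predict("asl_frame.jpg", conf=0.5)
    result.save("asl_frame_annotated.jpg")  # writes the annotated frame to disk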

Bibliographic Details
Main Author: Nasharudin, Muhammad Imran
Format: Thesis
Language: English
Published: 2023
Subjects: Applied psychology
Online Access:https://ir.uitm.edu.my/id/eprint/89043/1/89043.pdf
id my-uitm-ir.89043
record_format uketd_dc
spelling my-uitm-ir.89043 2024-03-19T07:08:33Z Sign language recognition using You Only Look Once-Neural Architecture Search / Muhammad Imran Nasharudin 2023 Nasharudin, Muhammad Imran Applied psychology American Sign Language (ASL) is a nonverbal language that uses visual sign patterns formed with the hands or other parts of the body, and it is typically used by people with hearing impairments. Deaf and mute people have difficulty communicating their thoughts, needs, and feelings through spoken language, so an alternative, computer-based method is needed for people who are deaf or hard of hearing, since vocal communication is not available to them. One issue in computer-based sign language recognition is latency, which delays the interpretation of certain gestures. To balance latency against throughput, the network architecture is discovered automatically using a Neural Architecture Search (NAS) technology called AutoNAC. The innovative features of YOLO-NAS include the quantization-aware modules QSP and QCI, which combine re-parameterization with 8-bit quantization to minimize accuracy loss during post-training quantization. The architecture is designed to detect small objects, improve localization accuracy, and raise the performance-per-compute ratio, making it suitable for real-time edge-device applications. Using YOLO-NAS for sign language recognition, we achieved an average detection rate of 86% across all sign language alphabet classes. YOLO-NAS networks are thus successfully applied to sign language recognition, with a reported mAP@50 of 96.41. 2023 Thesis https://ir.uitm.edu.my/id/eprint/89043/ https://ir.uitm.edu.my/id/eprint/89043/1/89043.pdf text en public degree Universiti Teknologi MARA, Melaka College of Computing, Informatics and Mathematics
institution Universiti Teknologi MARA
collection UiTM Institutional Repository
language English
topic Applied psychology
spellingShingle Applied psychology
Nasharudin, Muhammad Imran
Sign language recognition using You Only Look Once-Neural Architecture Search / Muhammad Imran Nasharudin
description American Sign Language (ASL) is a nonverbal language that uses visual sign patterns formed with the hands or other parts of the body, and it is typically used by people with hearing impairments. Deaf and mute people have difficulty communicating their thoughts, needs, and feelings through spoken language, so an alternative, computer-based method is needed for people who are deaf or hard of hearing, since vocal communication is not available to them. One issue in computer-based sign language recognition is latency, which delays the interpretation of certain gestures. To balance latency against throughput, the network architecture is discovered automatically using a Neural Architecture Search (NAS) technology called AutoNAC. The innovative features of YOLO-NAS include the quantization-aware modules QSP and QCI, which combine re-parameterization with 8-bit quantization to minimize accuracy loss during post-training quantization. The architecture is designed to detect small objects, improve localization accuracy, and raise the performance-per-compute ratio, making it suitable for real-time edge-device applications. Using YOLO-NAS for sign language recognition, we achieved an average detection rate of 86% across all sign language alphabet classes. YOLO-NAS networks are thus successfully applied to sign language recognition, with a reported mAP@50 of 96.41.
format Thesis
qualification_level Bachelor degree
author Nasharudin, Muhammad Imran
author_facet Nasharudin, Muhammad Imran
author_sort Nasharudin, Muhammad Imran
title Sign language recognition using You Only Look Once-Neural Architecture Search / Muhammad Imran Nasharudin
title_short Sign language recognition using You Only Look Once-Neural Architecture Search / Muhammad Imran Nasharudin
title_full Sign language recognition using You Only Look Once-Neural Architecture Search / Muhammad Imran Nasharudin
title_fullStr Sign language recognition using You Only Look Once-Neural Architecture Search / Muhammad Imran Nasharudin
title_full_unstemmed Sign language recognition using You Only Look Once-Neural Architecture Search / Muhammad Imran Nasharudin
title_sort sign language recognition using you only look once-neural architecture search / muhammad imran nasharudin
granting_institution Universiti Teknologi MARA, Melaka
granting_department College of Computing, Informatics and Mathematics
publishDate 2023
url https://ir.uitm.edu.my/id/eprint/89043/1/89043.pdf
_version_ 1794192192672104448