Malaysian sign language recognition framework based on sensory glove

The purpose of this study was to propose a low-cost, real-time recognition system using a sensory glove, which has 17 sensors with 65 channels, to capture static sign data of Malaysian Sign Language (MSL). The study uses an experimental design. Five participants proficient in MSL were chosen to perf...

Full description

Bibliographic Details
Main Author: Altaha, Mohamed Aktham Ahmed
Format: thesis
Language: eng
Published: 2019
Subjects:
Online Access: https://ir.upsi.edu.my/detailsg.php?det=5224
id oai:ir.upsi.edu.my:5224
record_format uketd_dc
institution Universiti Pendidikan Sultan Idris
collection UPSI Digital Repository
language eng
topic QA Mathematics
spellingShingle QA Mathematics
Altaha, Mohamed Aktham Ahmed
Malaysian sign language recognition framework based on sensory glove
description The purpose of this study was to propose a low-cost, real-time recognition system using a sensory glove, which has 17 sensors with 65 channels, to capture static sign data of Malaysian Sign Language (MSL). The study uses an experimental design. Five participants proficient in MSL were chosen to perform 75 gestures while wearing the sensory glove. This research was carried out in six phases as follows: Phase I involved a review of the literature via a systematic review approach to identify the relevant set of articles that helped formulate the research questions. Phase II focused on the analysis of hand anatomy, hand kinematics, and hand gestures to help understand the nature of MSL and to define the glove requirements. In Phase III, the DataGlove was designed and developed based on the glove requirements to help optimize the best functions of the glove. Phase IV involved the pre-processing, feature extraction, and classification of the data collected from the proposed DataGlove, and identified gestures of MSL. New vision-based and sensor-based MSL datasets were collected in Phase V. Phase VI focused on the evaluation and validation process across different development stages. The error rate was used to check system performance. Also, a 3D-printed humanoid arm was used to validate the sensors mounted on the glove. The results of data analysis showed 37 common patterns with similar hand gestures in MSL. Furthermore, the design of the DataGlove based on MSL analysis was effective in capturing a wide range of gestures, with a recognition accuracy of 99%, 96%, and 93.4% for numbers, alphabet letters, and words, respectively. In conclusion, the research findings suggest that the 37 gesture groups of MSL can increase the recognition accuracy of MSL hand gestures and help bridge the gap between people with hearing impairments and ordinary people. For future research, a more comprehensive analysis of the MSL recognition system is recommended.
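The Phase IV pipeline described above (pre-processing, feature extraction, and classification of 65-channel glove readings, evaluated by error rate) can be sketched as follows. This is a minimal illustration, not the thesis's actual method: the abstract does not name the classifier or the pre-processing scheme, so the min-max normalisation, nearest-centroid classifier, and all names here are assumptions made for the example.

```python
# Hypothetical sketch of a glove-data recognition pipeline:
# normalise channels, fit one centroid per gesture class, classify by
# nearest centroid, and report the error rate. All specifics are assumed.
import numpy as np

N_CHANNELS = 65  # the glove's 17 sensors expose 65 channels in total


def preprocess(samples: np.ndarray) -> np.ndarray:
    """Scale each channel to [0, 1] so sensors with different ranges are comparable."""
    lo = samples.min(axis=0)
    span = samples.max(axis=0) - lo
    span[span == 0] = 1.0  # guard against constant channels
    return (samples - lo) / span


def fit_centroids(features: np.ndarray, labels: np.ndarray) -> dict:
    """Compute one mean feature vector (centroid) per gesture class."""
    return {c: features[labels == c].mean(axis=0) for c in np.unique(labels)}


def classify(sample: np.ndarray, centroids: dict):
    """Assign the gesture whose centroid is nearest in Euclidean distance."""
    return min(centroids, key=lambda c: np.linalg.norm(sample - centroids[c]))


def error_rate(samples: np.ndarray, labels: np.ndarray, centroids: dict) -> float:
    """Fraction of misclassified samples, the metric used to check performance."""
    wrong = sum(classify(s, centroids) != y for s, y in zip(samples, labels))
    return wrong / len(labels)
```

In a real glove system the feature-extraction step would be richer (e.g. joint angles derived from flex-sensor readings), but the train/classify/error-rate skeleton stays the same.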
format thesis
qualification_name
qualification_level Doctorate
author Altaha, Mohamed Aktham Ahmed
author_facet Altaha, Mohamed Aktham Ahmed
author_sort Altaha, Mohamed Aktham Ahmed
title Malaysian sign language recognition framework based on sensory glove
title_short Malaysian sign language recognition framework based on sensory glove
title_full Malaysian sign language recognition framework based on sensory glove
title_fullStr Malaysian sign language recognition framework based on sensory glove
title_full_unstemmed Malaysian sign language recognition framework based on sensory glove
title_sort malaysian sign language recognition framework based on sensory glove
granting_institution Universiti Pendidikan Sultan Idris
granting_department Fakulti Seni, Komputeran dan Industri Kreatif
publishDate 2019
url https://ir.upsi.edu.my/detailsg.php?det=5224
_version_ 1747833173054586880
spelling oai:ir.upsi.edu.my:5224 2020-09-09 Malaysian sign language recognition framework based on sensory glove 2019 Altaha, Mohamed Aktham Ahmed QA Mathematics 2019 thesis https://ir.upsi.edu.my/detailsg.php?det=5224 text eng closedAccess Doctoral Universiti Pendidikan Sultan Idris Fakulti Seni, Komputeran dan Industri Kreatif