
A Novel Assistive Glove to Convert Arabic Sign Language into Speech

Published: 21 February 2023

Abstract

People with speech disorders often communicate through special gestures and sign language. However, people around them may not understand the meaning of those gestures. The research described in this article aims to provide an assistive device that helps such users communicate with others by translating their gestures into spoken words that others can understand. The proposed device is an electronic glove worn on the hand. It employs an MPU6050 accelerometer/gyroscope with 6 degrees of freedom to continuously monitor hand orientation and movement, and a potentiometer on each finger to monitor changes in finger posture. The signals from the MPU6050 and the potentiometers are routed to an Arduino board, where they are processed to determine the meaning of each gesture, which is then voiced using audio streams stored on an SD memory card. The audio output drives a speaker, allowing the listener to understand the meaning of each gesture. We built a database with the help of 10 deaf people who cannot speak: we asked them to wear the glove while performing a set of 40 Arabic sign language words and recorded the resulting data stream from the glove. That data was then used to train seven different learning algorithms. The results showed that the Decision Tree learning algorithm achieved the highest accuracy, 98%. A usability study was then conducted to determine the usefulness of the assistive device in real time.
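The pipeline the abstract describes (sensor readings → gesture classification → audio lookup) can be sketched in miniature. This is a hypothetical illustration, not the authors' firmware: the gesture names, feature values, and audio file names are invented, and a simple nearest-template match stands in for the article's trained Decision Tree, purely for brevity.

```python
# Hypothetical sketch of the glove pipeline: an 11-value feature vector
# (6 DoF from the MPU6050 plus five finger potentiometers) is matched
# against stored gesture templates, and the matching gesture's audio clip
# (as it would be stored on the SD card) is looked up for playback.
import math

# Toy templates: gesture name -> (feature vector, audio file).
# All names and values are illustrative, not from the article.
TEMPLATES = {
    "hello":  ([0.9, 0.1, 0.0, 0.0, 0.0, 0.0, 0.2, 0.2, 0.2, 0.2, 0.2], "hello.wav"),
    "thanks": ([0.0, 0.8, 0.1, 0.0, 0.0, 0.0, 0.9, 0.9, 0.1, 0.1, 0.1], "thanks.wav"),
}

def classify(reading):
    """Return (gesture, audio_file) for the stored template closest to the reading."""
    def dist(name):
        # Euclidean distance between the live reading and a stored template.
        return math.dist(reading, TEMPLATES[name][0])
    best = min(TEMPLATES, key=dist)
    return best, TEMPLATES[best][1]

# A noisy reading near the "hello" template resolves to that gesture's clip:
gesture, clip = classify([0.88, 0.12, 0.0, 0.0, 0.05, 0.0,
                          0.21, 0.19, 0.20, 0.22, 0.18])
print(gesture, clip)  # -> hello hello.wav
```

On the real device, the classification step would run on the Arduino and the returned file name would be streamed from the SD card to the speaker.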




Published In

ACM Transactions on Asian and Low-Resource Language Information Processing  Volume 22, Issue 2
February 2023
624 pages
ISSN:2375-4699
EISSN:2375-4702
DOI:10.1145/3572719

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 21 February 2023
Online AM: 24 June 2022
Accepted: 15 June 2022
Revision received: 05 June 2022
Received: 30 March 2022
Published in TALLIP Volume 22, Issue 2


Author Tags

  1. Assistive technology
  2. assistive glove
  3. deaf
  4. Arabic sign language

Qualifiers

  • Research-article


Cited By

  • (2024) Smart Glove with Mobile Application to Detect Static Arabic Hijaiyah Hand Code for Quran Recitation. 2024 21st Learning and Technology Conference (L&T), 64–69. DOI: 10.1109/LT60077.2024.10469054. Online publication date: 15-Jan-2024.
  • (2024) Survey on Hand Gestures Recognition for Sign Translation using Artificial Intelligence. 2024 5th International Conference on Mobile Computing and Sustainable Informatics (ICMCSI), 280–287. DOI: 10.1109/ICMCSI61536.2024.00047. Online publication date: 18-Jan-2024.
  • (2024) Multimodal Deep Neural Networks for Robust Sign Language Translation in Real-World Environments. 2024 Third International Conference on Distributed Computing and Electrical Circuits and Electronics (ICDCECE), 1–6. DOI: 10.1109/ICDCECE60827.2024.10548470. Online publication date: 26-Apr-2024.
  • (2024) Motion Images With Positioning Information and Deep Learning for Continuous Arabic Sign Language Recognition in Signer Dependent and Independent Modes. IEEE Access 12, 160728–160740. DOI: 10.1109/ACCESS.2024.3485131. Online publication date: 2024.
  • (2024) Internet of Things Enabled Smart Gloves Design Using ESP8266 Wifi Assisted Controller with Intelligent Sensors Association. 2024 International Conference on Advances in Computing, Communication and Applied Informatics (ACCAI), 1–7. DOI: 10.1109/ACCAI61061.2024.10602011. Online publication date: 9-May-2024.
  • (2024) An ultra-low-computation model for understanding sign languages. Expert Systems with Applications 249, 123782. DOI: 10.1016/j.eswa.2024.123782. Online publication date: Sep-2024.
  • (2023) Comprehensive study of Sign Language Conversion Using Machine Learning. 2023 10th IEEE Uttar Pradesh Section International Conference on Electrical, Electronics and Computer Engineering (UPCON), 1069–1075. DOI: 10.1109/UPCON59197.2023.10434799. Online publication date: 1-Dec-2023.
  • (2023) Recent works in Sign Language Recognition using deep learning approach – A Survey. 2023 OITS International Conference on Information Technology (OCIT), 502–507. DOI: 10.1109/OCIT59427.2023.10430576. Online publication date: 13-Dec-2023.
  • (2023) Smart Glove for Bi-lingual Sign Language Recognition using Machine Learning. 2023 International Conference on Intelligent Data Communication Technologies and Internet of Things (IDCIoT), 409–415. DOI: 10.1109/IDCIoT56793.2023.10053470. Online publication date: 5-Jan-2023.
  • (2023) Aikyam: A Video Conferencing Utility for Deaf and Dumb. 2023 9th International Conference on Smart Computing and Communications (ICSCC), 676–681. DOI: 10.1109/ICSCC59169.2023.10335068. Online publication date: 17-Aug-2023.
