Abstract
The main objective of this work is to propose the design of an interactive software system (ISS) that uses the "Leap Motion" hardware (Hw) device and is aimed at average users, as a feasible interactive solution for communication with deaf people.
With this proposal we hope to achieve natural recognition of hand movements, thereby obtaining a support system that helps the average user learn Mexican Sign Language (MSL); through gamification techniques applied to the ISS, the user can learn to communicate with a person with hearing impairment.
To support this proposal we reviewed the literature. In general, we observed that several of the papers consider other sign languages or other sign-recognition techniques; the number of papers that focus specifically on learning "Mexican Sign Language" using the "Leap Motion" Hw is therefore considerably small.
This allows us to conclude that the proposed design of an ISS for training average people in Mexican Sign Language is feasible: it combines an Hw tool that enables the communication and interpretation of MSL with gamification techniques, and it is supported by related papers confirming that the "Leap Motion" can be used for this purpose.
Keywords
- Access to education and learning
- Design for all education and training
- Design for quality of life technologies
- Evaluation of accessibility
- Usability
- User experience
1 Introduction
Mexico’s National Survey of Demographic Dynamics (ENADID) reported in 2014 that there are 7,184,054 persons with disabilities in Mexico (6% of the population), 2,405,855 of whom have hearing disabilities [1].
People with disabilities face daily and routine barriers to full social participation. Deaf people face a particular barrier when they are invited to an event (of any kind), as communication with others present is difficult. Sign language interpreters are rare in any part of the world, but the shortage is particularly acute in Mexico, where the most recent data suggest there are approximately 40 certified MSL interpreters [2].
Providing the necessary tools so that an average person can learn MSL could therefore go a long way to overcoming the problems associated with so few certified interpreters with such a large number of hearing-impaired persons.
There have been studies on the use of hardware tools as support in the field of computer science. Our literature review revealed that there are few, if any, published works specifically related to Mexican Sign Language and the use of a hardware device, including the "Leap Motion", for learning it.
This paper proposes the following:
- Leap Motion (and other hardware) can be used to learn, and to support the learning of, MSL (based on the literature review).
- This can be done by average, or non-professional, persons.
- Supporting the average user also collaterally supports and enhances communication between the hearing impaired and the non-hearing impaired.
2 Mexican Sign Language and Related Works
Mexican sign language – which originates from the French sign language family – is used and “spoken” by an estimated 130,000 people [3].
As with any language, it has its own syntax, grammar, and lexicon [4]; its users express themselves and communicate visually within a Mexican linguistic structure. Hand gestures and movements are very important, as they represent a large percentage of communication. The movement of hands, body, and arms, the shapes that are made, their orientation, and facial gestures all allow the interlocutor to better understand the meanings, emotions, and emphasis of the content being conveyed [4].
A sign language consists of three main parts:
1. Manual features.
2. Non-manual features.
3. Finger spelling.
Hence, a sign language involves several features whose main component is the hands [5].
Mexican Sign Language consists of a series of articulated gestural signs with a linguistic function; it is part of the linguistic heritage of the community and is as complex in grammar and vocabulary as any oral language.
Although sign languages are very useful, there is still a barrier between speakers and deaf people because the recipient of the message must know the symbols in order to understand it. As a consequence, the communication, and the process of communicating, can either break down or be non-existent.
Taking into account the above, we consider it important that a hearing person has access to learning Mexican Sign Language naturally (using a non-invasive Hw device) through an interactive software system (ISS). As a consequence, we expect an increase in the number of people who know and can use MSL which will in turn promote and facilitate the inclusion of Mexico’s hearing-impaired population.
2.1 Hand Alphabet
MSL studies can be grouped into two categories: static sign recognition and dynamic sign recognition [6]. Static signs include most of the letters of the hand alphabet that are made with a specific fixed posture of the hand. Dynamic signs involve the movement of one or both hands to perform a letter of the manual alphabet, a word or a statement.

Fig. 1. MSL alphabet. (Source: [6])
MSL is composed of more than 1000 different signs, which have regional variations. The MSL hand alphabet has 29 signs (see Fig. 1), corresponding to the letters: A, B, C, D, E, F, G, H, I, J, K, L, LL, M, N, Ñ, O, P, Q, R, S, T, U, V, W, X, Y, Z.
This work focuses on a first phase of learning the alphabet and greetings. Therefore, the aim is an ISS that focuses on the recognition of static signs, as is the case of [5, 7,8,9,10,11].
Much of the existing literature focuses on artificial intelligence (AI), fuzzy logic, and various sensors, including the Kinect. Other works focus on aspects such as number learning [13], medical applications [14,15,16], video games [17, 18], or other sign languages [19,20,21,22]; these perform hand gesture recognition with a sensor known as the "Leap Motion", which has responded very well in systems that involve hand gestures.
2.2 Leap Motion
The Leap Motion Controller (LM) is an optical hand tracking module that captures the movements of the hands with high accuracy, using a simple sensor that detects the hands (see Fig. 2) [23].

Fig. 2. Sensors of the Leap Motion Controller that recognize the hands. (Source: [23])
The Leap Motion Controller in conjunction with the hand tracking software captures all the subtlety and complexity of natural hand movements [23]. The LEDs of the Leap Motion Controller illuminate the hands with infrared light (see Fig. 3).
According to the manufacturer, the Leap Motion Controller is designed to provide real-time tracking of hands and fingers in three-dimensional space with 0.01-mm accuracy.
The positions of the hands and fingertips are detected in coordinates relative to the center of the controller, using a right-handed coordinate system with millimeters as the unit. Several gestures can be natively identified by the LM, such as the swipe, circle, key tap, and screen tap gestures [24].
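To illustrate how these palm-relative, millimeter-based coordinates might be prepared for sign comparison, the sketch below normalizes fingertip positions against the palm center so that a sign is represented independently of where the hand sits over the sensor. This is plain Python, not the real Leap SDK; the function name and data layout are assumptions for illustration only.

```python
# Hypothetical preprocessing sketch: translate fingertip positions so the
# palm center becomes the origin. Positions are (x, y, z) tuples in
# millimeters, following the LM's right-handed coordinate system.

def normalize_hand(palm_center, fingertips):
    """Return fingertip coordinates relative to the palm center."""
    px, py, pz = palm_center
    return [(x - px, y - py, z - pz) for (x, y, z) in fingertips]

# Example: palm 120 mm above the sensor, two fingertips.
palm = (0.0, 120.0, 0.0)
tips = [(10.0, 170.0, -20.0), (-15.0, 160.0, -25.0)]
print(normalize_hand(palm, tips))
# [(10.0, 50.0, -20.0), (-15.0, 40.0, -25.0)]
```

Normalizing in this way means the same hand shape produces the same feature values wherever it is performed within the sensor's field of view.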
The characteristics of the LM have resulted in a number of studies and papers on the recognition of hand gestures [13, 25, 26] to name a few. These studies, among others, highlight current interest in the recognition of hand gestures regarding Human-Computer Interaction (HCI) [27].

Fig. 3. Recognizing the hands with infrared light. (Source: [23])

Fig. 4. A) Usage of different data acquisition techniques in MSL systems. B) Research work carried out on static/dynamic signs in MSL. C) Percentage of research work carried out on the basis of single/double-handed signs in MSL. (Source: [3])
Although there are works that specifically focus on the recognition of hand gestures using the Leap Motion or Microsoft Kinect in different areas, none of them focuses on the recognition of the Mexican Sign Language alphabet using the Leap Motion. The majority of work on Mexican Sign Language has been performed using the Kinect (67%), followed by cameras (33%) (see Fig. 4) [3].
Further, work on MSL has addressed static signs (67%) followed by dynamic signs (33%); thus, 100% of the research has been performed on isolated signs only. Finally, 67% of the work on MSL has been performed on single-handed signs, with the remaining 33% using both single- and double-handed signs, as shown in Fig. 4 [3].
Based on this, and the features of Leap Motion, we propose the design of an ISS that supports the learning of Mexican Sign Language among the non-hearing impaired.
3 Proposal of an Interactive Software System Design for Learning the MSL
Key to this proposal is the basis of its design. The ISS will be designed for the non-hearing impaired, who will make use of a Natural User Interface (NUI). Using Ben Shneiderman’s 8 Golden Rules of Interface Design will help ensure that the interface is intuitive for the user.
Shneiderman’s collection of principles is derived heuristically from experience and applicable to most interactive systems after being refined, extended, and interpreted [28]:
- Strive for consistency.
- Enable frequent users to use shortcuts.
- Offer informative feedback.
- Design dialog to yield closure.
- Offer simple error handling.
- Permit easy reversal of actions.
- Support internal locus of control.
- Reduce short-term memory load.
This research adopts the eight Golden Rules of Interface Design. For example, to comply with the first rule, consistency, the system’s icons, menus, and colors, among other elements, will be consistent.
3.1 Proposal of Design for the ISS
This ISS design proposal will use the Leap Motion sensor and a workstation (PC) (see Fig. 5). The function of the LM sensor is to recognize the hand gestures corresponding to MSL; in turn, the PC will run the ISS, with which the user will interact naturally.
The main idea is to manage levels within the ISS by applying gamification techniques to promote learning in users who are interested in learning MSL. In this way the user will gradually learn different aspects of the MSL by interacting with the system through the Hw Leap Motion device.
To follow Shneiderman’s design rules, we propose an intuitive main menu (see Fig. 6) where the user can select a "basic" level or take an exam to access a more "advanced" level (see Fig. 7).
After the user enters a level, a series of exercises will be shown that must be performed until the necessary points are obtained to continue with the next lesson and so on until the next level is unlocked (see Fig. 8).
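The points-and-unlocking progression described above could be sketched as follows. This is a minimal illustration of the gamification loop; the class name, reward value, and threshold are assumptions, not part of the proposal's specification.

```python
# Hypothetical sketch of the gamification loop: a lesson accumulates points
# and unlocks the next lesson once a threshold is reached.

class LessonProgress:
    def __init__(self, points_to_pass=100):
        self.points_to_pass = points_to_pass  # points needed to advance
        self.points = 0                       # points in the current lesson
        self.lesson = 1                       # current lesson number

    def record_exercise(self, correct, reward=25):
        """Add points for a correct exercise; unlock the next lesson at the threshold."""
        if correct:
            self.points += reward
        if self.points >= self.points_to_pass:
            self.lesson += 1                  # next lesson unlocked
            self.points = 0                   # reset for the new lesson

p = LessonProgress()
for _ in range(4):                            # four correct exercises reach 100 points
    p.record_exercise(correct=True)
print(p.lesson)                               # 2
```

The same structure would repeat at the level granularity: completing all lessons of a level unlocks the next level.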
Each time the user finishes a lesson or level, the user will be informed with a message (see Fig. 9).
If the user makes a mistake somewhere in the lesson, they will be given the option to see the solution and do a practice exercise (see Fig. 10).
Regarding the operation of the ISS with the Leap Motion: when the user is in a lesson and is asked to make a particular sign, the LM obtains a representation of the hand gesture, which is compared with the one stored in the system database (DB). If they match, the exercise is marked as correct; otherwise, a life is subtracted and the user can continue with the lesson. The representation of the hand skeleton created with the LM is shown in Figs. 11 and 12. This representation of the hand gesture is analyzed in the ISS to compare and validate the exercise.
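The comparison step could work roughly as follows: compute a distance between the captured hand representation and the stored template, and accept the sign when the distance falls under a tolerance. This is a sketch under assumptions (function name, feature layout, and tolerance value are all hypothetical); in practice two performances of a gesture are never exactly equal, so a threshold is used instead of strict equality.

```python
import math

# Hypothetical sketch of validating a sign: compare the captured feature
# vector (e.g. normalized fingertip coordinates flattened into numbers)
# against the template stored in the DB, using Euclidean distance.

def matches_template(captured, template, tolerance=10.0):
    """Return True if the captured gesture is within `tolerance` (mm) of the template."""
    dist = math.sqrt(sum((c - t) ** 2 for c, t in zip(captured, template)))
    return dist <= tolerance

template = [10.0, 50.0, -20.0, -15.0, 40.0, -25.0]  # stored sign template (made up)
captured = [12.0, 49.0, -21.0, -14.0, 41.0, -24.0]  # user's attempt
print(matches_template(captured, template))          # True
```

Tuning the tolerance trades off strictness (rejecting sloppy signs) against usability (accepting natural variation between users).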
As can be observed, the intention is that the ISS complies with the interface design rules recommended by Shneiderman while, at the same time, the interaction remains natural for the hearing user.
Points to Evaluate from the ISS Design Proposal. The expectation is that the ISS design proposal and Shneiderman’s interface design rules will ensure a user-friendly interface.
Therefore, in the proposal the first golden rule “Strive for consistency” can be observed in the use of familiar icons to the user, in the colors and in the hierarchy of the elements on the screen (see Fig. 7, 8, 9 and 10).
Figure 7 demonstrates the second rule - enable frequent users to use shortcuts - in the form of a button that allows the user to skip a level and another that returns to the beginning.
Figures 8 and 9 illustrate the third rule - offer informative feedback - via an information bar highlighting the user’s progress.
Ensuring messages are visible to the user will ensure compliance with the fourth rule - "Design dialog to yield closure" (see Fig. 9).
The user will have the opportunity to learn and review each element, which will in turn give him or her a sense of control over the ISS and permit easy reversal of actions (the fifth and sixth rules).
To comply with the seventh rule, a minimum of icons is used in the interface design, with space used appropriately; this keeps the interface as simple as possible for the user and avoids making it difficult to remember how to reach a desired option within the ISS.
At first glance, the ISS proposal therefore complies with the 8 Shneiderman rules. This will be evaluated using the "Thinking Aloud" method, in which the user performs a set of tasks to verify that all of the elements in the system design proposal are intuitive and can handle any situation.
4 Conclusions and Future Works
The Leap Motion Controller is an effective hardware tool in recognizing hand gestures in different areas. We propose that this tool, when used in tandem with a natural user interface, will allow an average user to learn Mexican Sign Language.
Phase 1 and Phase 2 (which is pending) involve the "interface" and the "DB of Mexican Sign Language", respectively. Phase 3 will correspond to the analysis and design validation of the ISS. Although compliance with the design rules may appear evident at first glance, the corresponding validation tests must be performed for greater certainty.
References
SEDESOL-CONADIS La Sordoceguera en México: datos por el 27 de junio, día internacional de la sordoceguera. https://www.gob.mx/conadis/es/articulos/la-sordoceguera-en-mexico-datos-por-el-27-de-junio-dia-internacional-de-la-sordoceguera?idiom=es. Accessed 6 Jan 2020
SIPSE En México se hacen ciegos ante los sordos. https://sipse.com/mexico/sordos-discapacidad-gobierno-mexico-224324.html. Accessed 6 Jan 2020
Wadhawan, A., Kumar, P.: Sign language recognition systems: a decade systematic literature review. Arch. Comput. Methods Eng. 1–29 (2019)
Gobierno de México. https://www.gob.mx/conadis/articulos/lengua-de-senas-mexicana-lsm?idiom=es. Accessed 6 Jan 2020
Trujillo-Romero, F., Caballero-Morales, S.-O.: 3D data sensing for hand pose recognition. In: CONIELECOMP 2013, 23rd International Conference on Electronics, Communications and Computing, pp. 109–113. IEEE (2013)
Sosa-Jímenez, C., Ríos-Figueroa, H., Rechy-Ramírez, E., Marin-Hernandez, A., González-Cosío, A.: Real-time Mexican Sign Language recognition. In: 2017 IEEE International Autumn Meeting on Power, Electronics and Computing (ROPEC), pp. 1–6. IEEE (2017)
Luis-Pérez, F.E., Trujillo-Romero, F., Martínez-Velazco, W.: Control of a service robot using the Mexican sign language. In: Batyrshin, I., Sidorov, G. (eds.) MICAI 2011. LNCS (LNAI), vol. 7095, pp. 419–430. Springer, Heidelberg (2011). https://doi.org/10.1007/978-3-642-25330-0_37
Solís, J.-F., Toxqui-Quitl, C., Martínez-Martínez, D., Margarita, H.-G.: Mexican sign language recognition using normalized moments and artificial neural networks. In: Optics and Photonics for Information Processing VIII, pp. 92161A. ISOP (2014)
Galicia, R., Carranza, O., Jiménez, E.-D., Rivera, G.-E.: Mexican sign language recognition using movement sensor. In: 2015 IEEE 24th International Symposium on Industrial Electronics (ISIE), pp. 573–578. IEEE (2015)
López, E., Velásquez, J., Eleuterio, R., Gil, L.: Interfaz de reconocimiento de movimientos para el lenguaje de señas mexicano implementando el Kinect. Revista Aristas: Investigación Básica y Aplicada 4(7), 130–133 (2015)
Solís, F., Martínez, D., Espinoza, O.: Automatic Mexican sign language recognition using normalized moments and artificial neural networks. Engineering 8(10), 733–740 (2016)
Jimenez, J., Martin, A., Uc, V., Espinosa, A.: Mexican sign language alphanumerical gestures recognition using 3D Haar-like features. IEEE Latin Am. Trans. 15(10), 2000–2005 (2017)
Wang, Q., Wang, Y., Liu, F., Zeng, W.: Hand gesture recognition of Arabic numbers using leap motion via deterministic learning. In: 2017 36th Chinese Control Conference (CCC), pp. 10873–10828. IEEE (2017)
Li, W.-J., Hsieh, C.-Y., Lin, L.-F., Chu, W.-C.: Hand gesture recognition for post-stroke rehabilitation using leap motion. In: 2017 International Conference on Applied System Innovation (ICASI), pp. 386–388. IEEE (2017)
Morando, M., Ponte, S., Ferrara, E., Dellepiane, S.: Biophysical and motion features extraction for an effective home-based rehabilitation. In: Proceedings of the International Conference on Bioinformatics Research and Applications 2017, pp. 79–85. ACM (2017)
Nicola, S., Stoicu-Tivadar, L., Virag, I., Crişan-Vida, M.: Leap motion supporting medical education. In: 2016 12th IEEE International Symposium on Electronics and Telecommunications (ISETC), pp. 153–156. IEEE (2016)
Dzikri, A., Kurniawan, D.-E.: Hand gesture recognition for game 3D object using the leap motion controller with backpropagation method. In: 2018 International Conference on Applied Engineering (ICAE), pp. 1–5. IEEE (2018)
Zhi, D., de Oliveira, T.-E.-A., da Fonseca, V.-P., Petriu, E.-M.: Teaching a robot sign language using vision-based hand gesture recognition. In: 2018 IEEE International Conference on Computational Intelligence and Virtual Environments for Measurement Systems and Applications (CIVEMSA), pp. 1–6. IEEE (2018)
Mapari, R., Kharat, G.: American static signs recognition using leap motion sensor. In: Proceedings of the Second International Conference on Information and Communication Technology for Competitive Strategies, p. 67. ACM (2016)
Kotsidou, D., Angelis, C., Dragoumanos, S., Kakarountas, A.: Computer assisted gesture recognition for the Greek sign language/fingerspelling. In: Proceedings of the 19th Panhellenic Conference on Informatics, pp. 241–242. ACM (2015)
Chavan, P., Ghorpade, T., Padiya, P.: Indian sign language to forecast text using leap motion sensor and RF classifier. In: 2016 Symposium on Colossal Data Analysis and Networking (CDAN), pp. 1–5. IEEE (2016)
Anwar, A., Basuki, A., Sigit, R., Rahagiyanto, A., Zikky, M.: Feature extraction for Indonesian sign language (SIBI) using leap motion controller. In: 2017 21st International Computer Science and Engineering Conference (ICSEC), pp. 1–5. IEEE (2017)
Ultraleap. https://www.ultraleap.com/product/leap-motion-controller/. Accessed 13 Jan 2020
Ameur, S., Khalifa, A.-B., Bouhlel, M.-S.: A comprehensive leap motion database for hand gesture recognition. In: 2016 7th International Conference on Sciences of Electronics, Technologies of Information and Telecommunications (SETIT), pp. 514–519. IEEE (2016)
Sharma, A., Yadav, A., Srivastava, S., Gupta, R.: Analysis of movement and gesture recognition using leap motion controller. Procedia Comput. Sci. 132, 551–556 (2018)
Zeng, W., Wang, C., Wang, Q.: Hand gesture recognition using leap motion via deterministic learning. Multimedia Tools Appl. 77(21), 28185–28206 (2018). https://doi.org/10.1007/s11042-018-5998-1
Chaudhary, A., Raheja, J.-L., Das, K., Raheja, S.: Intelligent approaches to interact with machines using hand gesture recognition in natural way: a survey. arXiv preprint arXiv:1303.2292 (2013)
Yamakami, T.: A four-stage gate-keeper model of social service engineering: lessons from golden rules of mobile social game design. In: 2012 9th International Conference on Ubiquitous Intelligence and Computing and 9th International Conference on Autonomic and Trusted Computing, pp. 159–163. IEEE (2012)
Acknowledgement
I want to thank E. Girard for support in revising and correcting this paper, and A. Cruz for the helpful reviews that improved it.
© 2020 Springer Nature Switzerland AG
Alvarez-Robles, T., Álvarez, F., Carreño-León, M. (2020). Proposal for an Interactive Software System Design for Learning Mexican Sign Language with Leap Motion. In: Stephanidis, C., Antona, M., Gao, Q., Zhou, J. (eds) HCI International 2020 – Late Breaking Papers: Universal Access and Inclusive Design. HCII 2020. Lecture Notes in Computer Science(), vol 12426. Springer, Cham. https://doi.org/10.1007/978-3-030-60149-2_15
Print ISBN: 978-3-030-60148-5
Online ISBN: 978-3-030-60149-2