Developing a Color Identification Assistance Application for Individuals with Color Blindness
DOI: https://doi.org/10.1145/3657242.3658600
INTERACCION 2024: XXIV International Conference on Human Computer Interaction, A Coruña, Spain, June 2024
Color blindness is a visual impairment that results in a distorted perception of colors. It affects approximately 8% of males and 0.5% of females, and there is no cure for this condition. Visual aids such as special glasses or mobile apps may help overcome these limitations. However, many apps apply image processing to the entire image captured by the camera, which can be confusing, as it may also change the perception of non-conflicting colors. In this work we present the development of a vision-based interface, a mobile application, that aims to enhance only those colors that may cause confusion during identification. The system offers two working modes: the filter mode, which highlights the conflicting color regions by merging them with a personalized color, and the gray-scale mode, which converts the entire image to gray scale except for the conflicting colors, which remain unmodified.
ACM Reference Format:
Miguel Vidal-Coll and Cristina Manresa-Yee. 2024. Developing a Color Identification Assistance Application for Individuals with Color Blindness. In XXIV International Conference on Human Computer Interaction (INTERACCION 2024), June 19--21, 2024, A Coruña, Spain. ACM, New York, NY, USA, 5 pages. https://doi.org/10.1145/3657242.3658600
1 INTRODUCTION
Color blindness is a color vision deficiency that affects an individual's ability to see or differentiate between certain colors. This condition affects approximately 8% of males and 0.5% of females, with males being more commonly affected due to the genetics of the condition [4].
The most prevalent forms of color blindness are red-green color blindness, which includes protanomaly, protanopia, deuteranomaly, and deuteranopia, and blue-yellow color blindness, which includes tritanomaly and tritanopia. The rarest form is complete color blindness, known as achromatopsia, where individuals see the world in shades of gray [9]. Despite the challenges it presents, individuals with color blindness develop strategies to compensate for this limitation in their daily activities. However, the condition can restrict access to certain occupations and activities that rely heavily on color discrimination [19].
Color blindness can be detected through various diagnostic tests [1], the most common of which are the Ishihara color test and the Farnsworth-Munsell 100 hue test. Both tests follow a similar structure in which the user must interact with colored images, and their results are compared with those from charts, thus determining whether the user has color blindness, its type, and, in some cases, its degree.
Recent advancements in technology have led to the development of aids for individuals with color blindness, such as specialized glasses and lenses [17, 20], accessibility options in operating systems, and commercial smartphone applications that enhance color discrimination (e.g., NowYouSee [5], Color Blind Pal [7] or Spectra [13]). These tools work by altering the light spectrum or the way colors are displayed. Frequently, these apps apply image processing to the entire image captured by the camera, which can be confusing for the user, as it may also change the perception of non-conflicting colors. Further, Iqbal et al. [11] commented on the challenges faced by color-blind users when operating mobile phones due to the uniform adaptive features provided by mobile vendors.
Mining the literature, we also find research works contributing to this domain. Lau et al. [14] proposed a smartphone app based on the key idea of converting color contrasts in a static image into contrasts seen over time. As the user interacts with the device's touchscreen, the image colors change continually to help distinguish between problematic colors and judge the color of objects. They transformed the pixel colors with a simple shear and translation operation. They tested their solution with five users, resulting in improved color identification. Elrefaei [6] presented a smartphone-based experimental comparison of color correction algorithms for protanopia, deuteranopia, and tritanopia. They compared the LMS daltonization algorithm, the color-blind filter service (CBFS) algorithm, the LAB color corrector algorithm, and the shifting color algorithm, but did not evaluate them with users. Lausegger et al. [15] developed OmniColor, which used a slightly modified version of the daltonization algorithm in a Google Glass application, therefore requiring specific hardware. They tested their application with five users presenting protanomaly, improving the identification of the Ishihara plates.
In this work we present the development of a mobile app aimed at people with color blindness. The app comprises two distinct parts. On the one hand, we implemented the Ishihara color test to allow users to find out (or confirm) whether they present color blindness. On the other hand, we developed a camera interface in which the colors that usually cause confusion for people with color blindness are modified to be easily distinguishable. We modify only the colors that cause problems according to the type of color blindness, aiming to improve the user experience compared to other existing applications. Developing tools for color blindness not only advances our understanding of human vision but also drives innovation in creating more inclusive environments and technologies that accommodate the diverse ways people perceive the world.
The work is organized as follows. Section 2 describes the app, its design and implementation. Section 3 presents a usability evaluation with users. Section 4 presents a comparison with another similar app. Finally, the concluding section synthesizes the key takeaways from this research and outlines future lines of work.
2 APP FOR COLOR IDENTIFICATION
The solution proposed is an app developed for Android (version 5.0.0 (Lollipop) or later) which comprises two modules: the color test and the camera interface. We implemented the Ishihara color test [12] due to its speed and effectiveness. The test consists of a series of plates filled with dots of different colors and sizes. Within these plates, digits or patterns are embedded in a color that contrasts with the background, which can only be correctly identified if the viewer has normal color vision. From the 24-plate version, we used the set of 17 plates with digits for quick screening (see Fig. 1). Interpretation of the Ishihara test is straightforward: if the user identifies 13 or more plates, it is determined that they do not present color blindness. Conversely, correctly identifying 9 or fewer plates suggests the presence of some form of color blindness. In the rare instance where the user's correct answers fall between 10 and 12 plates, it is advisable to either retake the test or seek alternative testing methods. The test also includes plates designed to identify deuteranopia or protanopia. While highly effective for detecting red-green color blindness, this test is not designed to diagnose blue-yellow color blindness or the complete absence of color vision. For those conditions, although their prevalence is rare [11], other types of tests are required. If the type of color blindness cannot be determined, the application sets the configuration for deuteranopia, as it is the most common type of color blindness.
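The scoring rule above can be expressed as a small helper (the function name is hypothetical; the thresholds are the ones stated in the text):

```python
def interpret_ishihara(correct: int) -> str:
    """Map the number of correctly identified plates (out of 17)
    to an outcome, using the thresholds described above."""
    if correct >= 13:
        return "no color blindness detected"
    if correct <= 9:
        return "color blindness likely"
    # 10-12 correct answers are inconclusive
    return "inconclusive: retake the test or use an alternative method"
```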
Based on the color test results, the camera interface adapts to the conflicting colors. The camera interface uses the OpenCV library and, instead of applying a filter to the entire image, highlights those regions that present conflicting colors (red tones for protanopia and green tones for deuteranopia), allowing the user to identify the objects. There are two working modes: (1) the filter mode, which highlights the conflicting color regions by merging them with a personalized color (see Fig. 2), or (2) the gray-scale mode, which converts the entire image to gray scale except for the conflicting colors, which remain unmodified. The camera interface applies directly explainable algorithms, allowing developers to analyze the correctness of its functioning [8, 16].
Both working modes operate in the HSV color space (see Fig. 3(a)) and start by selecting the conflicting colors to create a mask based on an empirically computed threshold (see Fig. 3(b)). We then invert this mask to obtain a second mask with the non-conflicting colors (see Fig. 3(c)). These two masks are applied to the original image to select the conflicting and non-conflicting pixels. Based on the mask with the conflicting colors, the filter mode applies the personalized color selected by the user (in the example, pink is selected) and merges the colored mask with the original colors in that region (75% mask and 25% original image) (see Fig. 4(a)). The resulting image is the combination of the original image with the conflicting colors modified by the user's selected color (see Fig. 4(b)). In the gray-scale mode, adding the conflicting-pixel mask over the gray-scale image (see Fig. 4(c)) keeps the original colors only within the conflicting color regions (see Fig. 4(d)).
The design of the app uses a limited color palette (white for the background, black for the text, and lilac for the buttons), avoiding conflicting colors. Only two screens include extra colors, and these do not interfere with the others: the screen highlighting the errors in the color test, and the screen for selecting the filter color for the conflicting colors (see Fig. 2), which also includes text labels.
3 USABILITY EVALUATION
To assess the app, we conducted a usability evaluation with individuals both with and without color blindness. The aim was to compile a global view of subjective assessments of usability (effectiveness, efficiency and satisfaction) [10] and usefulness.
3.1 Participants
31 volunteers took part in the evaluation, of whom 61.3% were male (19 participants); their ages ranged from 21 to 64 years. Not all participants presented color blindness: 6 out of 31 (19.35%) had the condition, of whom four presented protanopia, one deuteranopia, and one an undetermined type.
3.2 Material
The material used for the evaluation was the app presented in this work on a Xiaomi Redmi Note 7. For the data collection, we used a questionnaire including demographic data, the System Usability Scale (SUS) [3] and extra questions to compile information on the user experience and potential improvements:
- Q1. Do you think an individual with color blindness would use this app daily? (Yes/No)
- Q2. Would you recommend this app to your acquaintances with color blindness? (Yes/No)
- Q3. Have you ever used a color correction application? If your answer is Yes, how was your experience? (Yes/No and open question)
- Q4. Do you have any recommendations to improve the user experience of this app? (open question)
3.3 Procedure
Participants used the app without instructions for as long as they wished; they completed the color blindness test and used the camera interface. The app was set with the default color for the filter. They then answered the demographic questionnaire, the SUS, and the extra questions. While they answered, the interviewer was present and collected the comments given by the participants.
3.4 Results
The average SUS score across all participants was 80.57. This score positions the system above the industry average of 68 [18]. Further, as the score is over 80, it is classified as Excellent on the adjective scale, suggesting that participants found our system highly effective, efficient, and satisfying to use [2].
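For reference, the standard SUS scoring rule [3] that produces such a 0-100 value can be sketched as follows (the function name is hypothetical):

```python
def sus_score(responses: list[int]) -> float:
    """Compute a single SUS score (0-100) from the ten 1-5 Likert responses.
    Odd-numbered items (positive statements) score response - 1;
    even-numbered items (negative statements) score 5 - response;
    the sum of item scores is multiplied by 2.5."""
    assert len(responses) == 10
    total = sum(r - 1 if i % 2 == 0 else 5 - r
                for i, r in enumerate(responses))
    return total * 2.5
```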
Regarding the questions, 84% of the participants answered Yes to Q1, and some of them were already asking whether the app was available for download before answering. Participants mentioned tasks the app could help with, such as shopping or selecting clothes. 100% of the participants answered that they would recommend the app to people with color blindness, regardless of their answer to the previous question. Surprisingly, only 33% of the participants with color blindness answered that they had previously used a color correction application (Q3), and only 16% of the other participants had used this kind of software, but just out of curiosity. Finally, suggestions to improve the application (Q4) include:
- The most repeated request was to be able to zoom into the results table of the color test, as some users, especially older ones, tried to zoom in when viewing the table.
- Another suggestion related to the color test section was to be able to move from plate to plate using the keyboard, instead of hiding it and using the graphical Next button (see Fig. 1).
- Add an option to photograph what is being captured on the screen.
- The last suggestion to highlight was to create a plugin to implement in web browsers, which would perform the same functionality but with web pages instead of with the camera.
4 COMPARISON WITH ANOTHER APP
We conducted a comparison with another app that applies a filter to the entire image. The aim of this comparison was to study which strategy is more effective: applying image processing to the entire image or only to the regions with conflicting colors. We chose Spectra as the app to compare against because it most closely resembles the structure and functionality of the application proposed in this work.
4.1 Participants
Seven participants with color blindness took part in the comparison with the other app. Six of them had also participated in the usability evaluation. The additional participant was a male with protanopia.
4.2 Material
The material used for the evaluation was the app presented in this work and the Spectra app. Spectra is a commercial Android app which includes an adaptation of the Farnsworth-Munsell test to identify the type of color blindness (protanopia, deuteranopia, or tritanopia) and its degree. Then, according to the type and degree of color blindness, the camera interface corrects the colors. There is also an option to switch to normal mode and an option to freeze the image (see Fig. 5).
4.3 Procedure
To adequately address this comparison, our approach exclusively involved participants with color blindness. The seven participants were asked to complete the Ishihara test on two additional occasions, each time viewing the test plates through one of the two applications. This test was done on a different day from the usability evaluation, and the starting app was randomly selected to avoid learning effects. Both applications were set to the default values after conducting the color blindness test.
The rationale behind this test is based on the premise that users with color blindness will be able to discern more numbers on the Ishihara plates using one of the applications designed to aid in color differentiation.
4.4 Results
As Table 1 shows, all participants present color blindness, since they correctly answered 9 or fewer plates (column Orig.). In the third column, we observe the outcomes achieved on the test when using the Spectra app. The results clearly improve, with the average number of correct answers increasing by 4.86 per user. When using the app proposed in this work to conduct the color test, the number of correct responses increases further in all instances (see Table 1, column App), generally by an additional 2 points (see Table 1, column Dif.).
Table 1. Correct Ishihara answers: unaided (Orig.), with Spectra, and with the proposed app (App); Dif. is the difference between App and Spectra.

id | Orig. | Spectra | App | Dif.
---|-------|---------|-----|-----
1  | 7     | 10      | 12  | 2
2  | 5     | 11      | 13  | 2
3  | 3     | 9       | 11  | 2
4  | 3     | 10      | 12  | 2
5  | 6     | 9       | 12  | 3
6  | 7     | 10      | 12  | 2
7  | 3     | 9       | 11  | 2
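As a sanity check, the average improvements discussed in the text can be recomputed from the per-participant counts transcribed from Table 1:

```python
# Correct answers per participant, transcribed from Table 1.
orig    = [7, 5, 3, 3, 6, 7, 3]
spectra = [10, 11, 9, 10, 9, 10, 9]
app     = [12, 13, 11, 12, 12, 12, 11]

def mean(xs):
    return sum(xs) / len(xs)

spectra_gain = mean(spectra) - mean(orig)  # improvement of Spectra over no aid
app_gain = mean(app) - mean(spectra)       # further improvement of the proposed app
print(round(spectra_gain, 2), round(app_gain, 2))  # prints: 4.86 2.14
```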
This enhancement in results can be attributed to the fact that using the applications makes it easier to distinguish between some of the color combinations. However, this improvement does not occur uniformly across all the plates. With the Spectra app, it is easier for users to read plates 1, 2, 3, 6, and 7. On the other hand, the application developed in this project significantly aids in the identification of plates 1, 6, 7, 10, 11, 12, 13, 16, and 17. As we can see, there are some plates where improvement is noted with both applications, but we next examine cases where one application proves more effective than the other.
For example, in the case of the third plate, which displays the number 29, the majority of users correctly identified the number using the Spectra application for assistance, whereas with the proposed application, a considerable number of incorrect responses were noted. As can be seen in the image (see Fig. 6), the Spectra application's filter enhances the red hues, which aids in visualizing the contours of both numbers. In the case of the application presented, the lower stroke of the 2 and the central part of the 9 are not sufficiently identified as conflicting colors, leading to outcomes such as 70 or 20 during the tests. The opposite scenario occurs with the plate containing the number 73. In this instance, none of the participants managed to correctly identify the number using the Spectra app. However, all participants were able to clearly see the number when using the application developed in this work. As depicted in Fig. 7, while using the proposed app, the number itself is not highlighted, but the background is very clearly emphasized, thus allowing for perfect readability. This same situation occurred with the plates containing the numbers 5, 7, and 16, corresponding to plates 10, 11, and 12 of the test.
5 CONCLUSION
We developed a vision-based interface to help enhance conflicting colors for people with color blindness. The approach identifies the confusing colors for red-green color blindness and highlights them for the user in two different ways: applying a colored filter on the conflicting colors, or removing the color information from all regions (converting them to gray scale) except the conflicting ones. There exist diverse types and degrees of color blindness; therefore, a broad spectrum of tools and solutions may effectively accommodate different profiles and needs, allowing each user to choose their optimal solution.
Future lines of work include redesigning the app following the Android design guidelines, comparing different algorithms with the proposed approach, designing for blue-yellow color blindness, and developing a third "contour mode" in which the contours of the conflicting regions would be highlighted.
ACKNOWLEDGMENTS
This work has been supported by Grant PID2019-104829RA-I00 funded by MCIN/ AEI /10.13039/501100011033, EXPLainable Artificial INtelligence systems for health and well-beING (EXPLAINING).
REFERENCES
- Alex Melamud, Stephanie Hagstrom, and Elias Traboulsi. 2004. Color vision testing. Ophthalmic Genetics 25, 3 (2004), 159–187. https://doi.org/10.1080/13816810490498341
- Aaron Bangor, Philip T. Kortum, and James T. Miller. 2009. Determining what individual SUS scores mean: adding an adjective rating scale. Journal of Usability Studies archive 4 (2009), 114–123. https://api.semanticscholar.org/CorpusID:7812093
- J. Brooke. 1996. SUS-A quick and dirty usability scale. Usability evaluation in industry 189, 194 (1996), 4–7.
- Forrest Clements. 1930. Racial differences in color-blindness. American Journal of Physical Anthropology 14, 3 (1930), 417–432. https://doi.org/10.1002/ajpa.1330140305
- Martin Douděra. 2018. NowYouSee. https://play.google.com/store/apps/details?id=com.areyoucolorblind.nowyousee&hl=en_US
- Lamiaa A. Elrefaei. 2018. Smartphone Based Image Color Correction for Color Blindness. International Journal of Interactive Mobile Technologies (iJIM) 12, 3 (Jul. 2018), 104–119. https://doi.org/10.3991/ijim.v12i3.8160
- Vincent Fiorentini. 2023. Color Blind Pal. https://play.google.com/store/apps/details?id=com.colorblindpal.colorblindpal&hl=es&gl=US
- David Gunning and David W Aha. 2019. DARPA's Explainable Artificial Intelligence (XAI) Program. AI Mag. 40, 2 (2019), 44–58. https://doi.org/10.1609/aimag.v40i2.2850
- National Eye Institute. 2023. Types of Color Vision Deficiency. National Eye Institute (August 2023). https://www.nei.nih.gov/learn-about-eye-health/eye-conditions-and-diseases/color-blindness/types-color-vision-deficiency Last visited: 18 January 2024. Last updated: August 7, 2023.
- International Organization for Standardization. 2019. ISO 9241-210:2019 Ergonomics of human-system interaction, Part 210: Human-centred design for interactive systems. 33 pages.
- Muhammad Waseem Iqbal, Syed Khuram Shahzad, Nadeem Ahmad, Alessia Amelio, and Darko Brodic. 2018. Adaptive interface for color-blind people in mobile-phones. In 2018 International Conference on Advancements in Computational Sciences (ICACS). 1–8. https://doi.org/10.1109/ICACS.2018.8333488
- Shinobu Ishihara. 1972. The Series of Plates Designed as a Test for Colour-Blindness. Kanehara Shuppan Co., Ltd.
- Anika Jain. 2020. Spectra: Color-Blind Assistant. https://play.google.com/store/apps/details?id=com.anikasid.spectra&hl=en_US
- Chery Lau, Nicolas Perdu, Carlos E. Rodriguez-Pardo, Sabine Süsstrunk, and Gaurav Sharma. 2015. An Interactive App for Color Deficient Viewers. Color Imaging XX: Displaying, Processing, Hardcopy, And Applications. https://doi.org/10.1117/12.2075162
- Georg Lausegger, Michael Spitzer, and Martin Ebner. 2017. OmniColor – A Smart Glasses App to Support Colorblind People. International Journal of Interactive Mobile Technologies (iJIM) 11, 5 (Jul. 2017), 161–177. https://doi.org/10.3991/ijim.v11i5.6922
- Cristina Manresa-Yee, Maria Francesca Roig-Maimó, Silvia Ramis, and Ramon Mas-Sansó. 2022. Advances in XAI: Explanation Interfaces in Healthcare. Springer International Publishing, Cham, 357–369. https://doi.org/10.1007/978-3-030-83620-7_15
- Cat Pattie, Stacey Aston, and Gabriele Jordan. 2022. Do EnChroma glasses improve performance on clinical tests for red-green color deficiencies? Opt. Express 30, 18 (Aug 2022), 31872–31888. https://doi.org/10.1364/OE.456426
- J. Sauro and J.R. Lewis. 2012. Quantifying the User Experience: Practical Statistics for User Research.
- M. P. Simunovic. 2010. Colour vision deficiency. Eye 24, 5 (01 May 2010), 747–755. https://doi.org/10.1038/eye.2009.251
- Helen A. Swarbrick, Phuong Nguyen, Tuyen Nguyen, and Phuong Pham. 2001. The ChromaGen contact lens system: colour vision test results and subjective responses. Ophthalmic and Physiological Optics 21, 3 (2001), 182–196. https://doi.org/10.1046/j.1475-1313.2001.00583.x
© 2024 Copyright held by the owner/author(s). Publication rights licensed to ACM.
ACM ISBN 979-8-4007-1787-1/24/06.