
HoloAAC: A Mixed Reality AAC Application for People with Expressive Language Difficulties

  • Conference paper
  • In: Virtual, Augmented and Mixed Reality (HCII 2024)

Abstract

We present HoloAAC, a novel augmentative and alternative communication (AAC) application based on mixed reality that helps people with expressive language difficulties communicate in grocery shopping scenarios. A user who has difficulty speaking can convey their intention by pressing a few buttons. The application uses computer vision techniques to automatically detect grocery items, helping the user quickly locate the items of interest, and natural language processing techniques to categorize the available sentences so the user can quickly find the desired one. We evaluate HoloAAC with AAC users and compare its efficacy with that of traditional AAC applications. HoloAAC contributes to the early exploration of context-aware AR-based AAC applications and provides insights for future research.
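To make the described pipeline concrete, the sketch below is an illustration only, not the authors' implementation: it assumes a hypothetical computer-vision step (`detect_items`), an invented toy sentence bank, and made-up category names, and shows how detected grocery items could be used to re-rank pre-stored sentences so the desired one surfaces after a few button presses.

```python
# Illustrative sketch only (not the HoloAAC implementation).
# Assumption: a computer-vision step returns labels of grocery items in view,
# and a pre-stored sentence bank is re-ranked so relevant sentences come first.

from dataclasses import dataclass


@dataclass(frozen=True)
class Sentence:
    text: str
    category: str        # hypothetical categories, e.g. "ask_location", "checkout"
    keywords: frozenset   # grocery-item words the sentence refers to


# Toy sentence bank; the real application's sentences and categories are not shown here.
SENTENCE_BANK = [
    Sentence("Where can I find the milk?", "ask_location", frozenset({"milk"})),
    Sentence("How much does this bread cost?", "ask_price", frozenset({"bread"})),
    Sentence("Could you help me reach the cereal?", "ask_help", frozenset({"cereal"})),
    Sentence("I would like to pay by card.", "checkout", frozenset()),
]


def detect_items(frame) -> list[str]:
    """Hypothetical stand-in for the computer-vision step that labels grocery items."""
    raise NotImplementedError("e.g. an object detector running on the headset camera")


def rank_sentences(detected: list[str], bank: list[Sentence]) -> list[Sentence]:
    """Order the bank so sentences mentioning a detected item appear first,
    then group the remaining sentences by category for quick browsing."""
    detected_set = set(detected)
    return sorted(bank, key=lambda s: (-len(s.keywords & detected_set), s.category))


if __name__ == "__main__":
    # Pretend the detector reported milk and bread in the current frame.
    for s in rank_sentences(["milk", "bread"], SENTENCE_BANK):
        print(f"[{s.category}] {s.text}")
```

In the paper's setting, a ranking like this would feed the mixed reality interface, so a relevant sentence could be selected and spoken with only a few button presses.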


Notes

  1. http://students.washington.edu/bhimar/highlights/2020-12-18-GrocerEye/.
  2. https://www.asha.org/njc/aac/.
  3. https://www.assistiveware.com/products/proloquo2go.


Acknowledgments

We are grateful to the participants for their feedback on our application. This project was supported by NSF grants (award numbers: 1942531 and 2128867).

Author information


Corresponding author

Correspondence to Liuchuan Yu.


Ethics declarations

Disclosure of Interests

The authors declare no competing interests relevant to the content of this article.

Rights and permissions

Reprints and permissions

Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Yu, L. et al. (2024). HoloAAC: A Mixed Reality AAC Application for People with Expressive Language Difficulties. In: Chen, J.Y.C., Fragomeni, G. (eds) Virtual, Augmented and Mixed Reality. HCII 2024. Lecture Notes in Computer Science, vol 14708. Springer, Cham. https://doi.org/10.1007/978-3-031-61047-9_20


  • DOI: https://doi.org/10.1007/978-3-031-61047-9_20


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-61046-2

  • Online ISBN: 978-3-031-61047-9

  • eBook Packages: Computer Science, Computer Science (R0)
