AtAwAR Translate: Attention-Aware Language Translation Application in Augmented Reality for Mobile Phones
Abstract
1. Introduction
2. Related Work
2.1. AR Translation
2.2. Attention Classification
2.3. Mobile and Passive BCIs
3. The Mobile BCI-Smartphone System and Application
3.1. Translation App Implementation
3.1.1. Text Sticker Creation
3.1.2. Text Sticker Update
3.2. Brain–Computer Interface
- To reduce interference from the Android OS (Mind Monitor should run in the foreground);
- Because the AR translation app is already computationally demanding, it makes sense not to put any extra strain on the device;
- Because it enables continuous monitoring during the study, to ensure that there are no issues with the connection between the Muse headset and Mind Monitor;
- Because the creator of Mind Monitor mentioned that the Muse headset has connection issues with the Bluetooth module of Huawei phones, which were used as the translation app devices.
- Absolute EEG band power values for the delta, theta, alpha, beta, and gamma frequency bands, each as four float values (one per electrode); the frequency ranges are delta (1–4 Hz), theta (4–8 Hz), alpha (7.5–13 Hz), beta (13–30 Hz), and gamma (30–44 Hz);
- Horseshoe values indicating the fit of the electrodes;
- Battery info;
- Gyroscope data measuring orientation and angular velocity;
- Accelerometer data measuring acceleration;
- Blink; and
- Jaw Clench.
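Mind Monitor forwards these values as OSC (Open Sound Control) messages over UDP. As a minimal sketch of how a JVM-side receiver for this stream could look, the snippet below uses the JavaOSC library's listener API (version 0.4) to subscribe to the absolute alpha band power and the horseshoe fit; the OSC paths and the port number are assumptions based on Mind Monitor's public documentation rather than values taken from this paper, and the same pattern extends to the other bands, the gyroscope and accelerometer data, and the blink and jaw clench events.

```java
import com.illposed.osc.OSCPortIn;

import java.net.SocketException;
import java.util.List;

public class MindMonitorReceiver {

    public static void main(String[] args) throws SocketException {
        // The port is configured in Mind Monitor's streaming settings;
        // 5000 is a common choice (assumption, not from the paper).
        OSCPortIn receiver = new OSCPortIn(5000);

        // Absolute alpha band power: four floats, one per electrode
        // (TP9, AF7, AF8, TP10 on the Muse headband).
        receiver.addListener("/muse/elements/alpha_absolute", (time, message) -> {
            List<Object> values = message.getArguments();
            float sum = 0f;
            for (Object v : values) {
                sum += (Float) v; // each argument arrives as a boxed Float
            }
            System.out.printf("mean alpha power: %.3f%n", sum / values.size());
        });

        // Horseshoe electrode fit: one value per electrode,
        // where 1 = good, 2 = medium, 4 = bad contact.
        receiver.addListener("/muse/elements/horseshoe", (time, message) ->
                System.out.println("electrode fit: " + message.getArguments()));

        receiver.startListening(); // receives on a background thread
    }
}
```

On the phone itself, such a listener would run inside an Android background component rather than a `main` method; the sketch only illustrates the shape of the incoming messages.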
3.2.1. Classification
3.2.2. Prediction Integration
4. User Study
4.1. Hypotheses
4.1.1. Main Hypotheses
4.1.2. Other Hypotheses
4.2. Methods
4.2.1. Participants
4.2.2. Procedure
4.2.3. Questionnaires
4.2.4. Stimuli
Criteria for the Distractiveness of Translation Updates of Posters
Criteria for the Difficulty of Text and Question Combinations
4.3. Results
4.3.1. AtAwAR Translate Application Rating
Conclusions
4.3.2. Classification Accuracies and Pauses
Conclusions
4.3.3. SUS and NASA-TLX
Conclusions
4.3.4. Hypotheses
Conclusions
5. Discussion
5.1. Use Case
5.2. AtAwAR Translate Application
5.3. BCI Integration
5.4. Attentional State Application Adaptation
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
| App Name | Author | Release | Downloads | Langs. | RT | AR | Text Display |
|---|---|---|---|---|---|---|---|
| Google Translate | Google | 2015 | 500M+ | 109 | yes | yes | Replacement |
| Microsoft Translator | Microsoft | 2015 | 50M+ | 22 | (yes) | | Overlaid |
| Camera Translator | EVOLLY.APP | 2017 | 10M+ | 56 | (yes) | | Overlaid |
| Camera Translator | Fox Solution | 2018 | 1M+ | 163 | | | Separate |
| Translator for Texts, Websites & Photos | Octaviassil | 2018 | 1M+ | 108 | | | Separate |
| Cam Translate | Xiaoling App | 2019 | 100k+ | 28 | | | Separate |
| Language Translator | Touchpedia | 2021 | 50k+ | 105 | | | Separate |
| NASA-TLX Category | With BCI (M) | With BCI (SD) | Without BCI (M) | Without BCI (SD) | Difference |
|---|---|---|---|---|---|
| Mental Demand | 5.75 | 2.30 | 6.50 | 1.57 | −0.75 |
| Physical Demand | 2.33 | 1.15 | 2.33 | 1.30 | 0.00 |
| Temporal Demand | 2.83 | 2.40 | 1.67 | 0.78 | 1.17 |
| Performance | 5.25 | 2.45 | 5.75 | 2.22 | −0.50 |
| Effort | 6.00 | 2.22 | 5.41 | 1.72 | 0.58 |
| Frustration | 4.92 | 2.78 | 4.50 | 2.35 | 0.41 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Citation: Vortmann, L.-M.; Weidenbach, P.; Putze, F. AtAwAR Translate: Attention-Aware Language Translation Application in Augmented Reality for Mobile Phones. Sensors 2022, 22, 6160. https://doi.org/10.3390/s22166160