DOI: 10.1145/3563657.3596121
research-article
Open access

Improving and Analyzing Sketchy High-Fidelity Free-Eye Drawing

Published: 10 July 2023

Abstract

Some people whose motor disabilities limit hand movement use technology to draw with their eyes. Free-eye drawing has recently been re-investigated and has yielded state-of-the-art results with unimodal gaze control. However, limitations remain, including a narrow set of functions, conflicts between observation and drawing, and the brush-tailing issue. We introduce a professional free-eye drawing application driven solely by unimodal gaze control, and we improve on prior free-eye drawing through extended gaze-based user interface functions, improved brush dynamics, and a double-blink gaze gesture. An experiment and a field study were conducted to assess the system's usability, compared with the mainstream gaze-control drawing method and with hand drawing, and its accessibility for users with motor disabilities. The results show that the application provides efficient interaction and enables people with motor disabilities to create graphics at the level of hand sketches. We contribute a robust and professional free-eye drawing application and detail design considerations valuable for future developments in gaze interaction.
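The abstract names a double-blink gaze gesture as one of the added interaction techniques but does not spell out how it is detected. As a rough illustration only, and not the authors' implementation, the sketch below shows one common way such a gesture can be recognized from eye-tracker blink events: filter out short spontaneous blinks with a minimum-duration threshold, then trigger when two deliberate blinks occur within a short interval. All parameter values (min_duration, max_gap) and the Blink / detect_double_blink names are hypothetical placeholders.

# Illustrative sketch only: detects a "double blink" from a stream of blink
# events (start, end timestamps), assuming hypothetical thresholds. This is
# NOT the paper's implementation; parameter values are placeholders.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Blink:
    start: float  # seconds
    end: float    # seconds

    @property
    def duration(self) -> float:
        return self.end - self.start

def detect_double_blink(
    blinks: List[Blink],
    min_duration: float = 0.15,  # assumed: deliberate blinks last longer than ~150 ms
    max_gap: float = 0.50,       # assumed: second blink must start within 500 ms of the first
) -> Optional[float]:
    """Return the time at which the first double blink completes, or None."""
    # Keep only blinks long enough to be considered deliberate rather than spontaneous.
    deliberate = [b for b in blinks if b.duration >= min_duration]
    for first, second in zip(deliberate, deliberate[1:]):
        if 0.0 <= second.start - first.end <= max_gap:
            return second.end  # fire the gesture when the second blink finishes
    return None

if __name__ == "__main__":
    stream = [Blink(0.0, 0.08),                   # short spontaneous blink, ignored
              Blink(1.0, 1.25), Blink(1.5, 1.75),  # two deliberate blinks close together
              Blink(4.0, 4.3)]
    print(detect_double_blink(stream))  # -> 1.75

In practice such thresholds would need per-user calibration, since spontaneous blink rates and durations vary widely across individuals and tasks.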


Cited By

  • (2024) "I see an IC: A Mixed-Methods Approach to Study Human Problem-Solving Processes in Hardware Reverse Engineering." In Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1–20. https://doi.org/10.1145/3613904.3642837. Online publication date: 11 May 2024.

Published In

DIS '23: Proceedings of the 2023 ACM Designing Interactive Systems Conference
July 2023
2717 pages
ISBN: 9781450398930
DOI: 10.1145/3563657
This work is licensed under a Creative Commons Attribution 4.0 International License.


Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 10 July 2023


Author Tags

  1. Double Blink
  2. Eye Tracking
  3. Free-Eye Drawing
  4. Gaze Control
  5. Usability

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

DIS '23: Designing Interactive Systems Conference
July 10–14, 2023
Pittsburgh, PA, USA

Acceptance Rates

Overall acceptance rate: 1,158 of 4,684 submissions (25%)


Article Metrics

  • Downloads (last 12 months): 209
  • Downloads (last 6 weeks): 31
Reflects downloads up to 25 Jan 2025

