
earPod: eyes-free menu selection using touch input and reactive audio feedback

Published: 29 April 2007
DOI: 10.1145/1240624.1240836

Abstract

We present the design and evaluation of earPod: an eyes-free menu technique using touch input and reactive auditory feedback. Studies comparing earPod with an iPod-like visual menu technique on reasonably sized static menus indicate that they are comparable in accuracy. In terms of efficiency (speed), earPod is initially slower, but outperforms the visual technique within 30 minutes of practice. Our results indicate that earPod is potentially a reasonable eyes-free menu technique for general use, and is a particularly exciting technique for use in mobile device interfaces.
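
The abstract summarizes the evaluation rather than the mechanics of the technique. As a purely illustrative sketch of the general idea it names (touch input driving reactive auditory feedback for menu selection), the following Python fragment maps touch positions on a circular pad to menu sectors, announces an item whenever the finger enters a new sector, and confirms the item under the finger when it lifts. The AudioMenu class, the sector layout, and the speak callback are hypothetical stand-ins, not the authors' earPod design.

    import math

    class AudioMenu:
        """Illustrative eyes-free menu (not the published earPod design):
        the touch position selects a sector, each sector change triggers
        auditory feedback, and lifting the finger confirms the item."""

        def __init__(self, items, speak=print):
            self.items = items      # menu labels
            self.speak = speak      # audio callback; stand-in for TTS or clip playback
            self.current = None     # index of the item currently under the finger

        def _sector(self, x, y):
            # Map a point on a circular pad (centre at 0,0) to a sector index.
            angle = math.atan2(y, x) % (2 * math.pi)
            return int(angle // (2 * math.pi / len(self.items)))

        def on_touch_move(self, x, y):
            idx = self._sector(x, y)
            if idx != self.current:          # reactive feedback: speak only on change
                self.current = idx
                self.speak(self.items[idx])

        def on_touch_up(self):
            # Lifting the finger selects whatever was last announced.
            chosen = None if self.current is None else self.items[self.current]
            self.current = None
            return chosen

    # Example: a finger sweeping counter-clockwise across three sectors, then lifting.
    menu = AudioMenu(["Playlists", "Artists", "Songs", "Settings"])
    for x, y in [(1.0, 0.1), (-0.1, 1.0), (-1.0, -0.1)]:
        menu.on_touch_move(x, y)             # announces Playlists, Artists, Songs
    print("selected:", menu.on_touch_up())   # -> selected: Songs

A real implementation would presumably replace the speak callback with prerecorded clips or speech synthesis and would smooth noisy touch samples before mapping them to sectors.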

      Published In

      CHI '07: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
      April 2007
      1654 pages
      ISBN:9781595935939
      DOI:10.1145/1240624

      Publisher

      Association for Computing Machinery, New York, NY, United States

      Author Tags

      1. auditory menu
      2. gestural interaction

      Qualifiers

      • Article

      Conference

      CHI '07: CHI Conference on Human Factors in Computing Systems
      April 28 - May 3, 2007
      San Jose, California, USA

      Acceptance Rates

      CHI '07 paper acceptance rate: 182 of 840 submissions (22%)
      Overall acceptance rate: 6,199 of 26,314 submissions (24%)

      Cited By

      • (2024) Towards a Personal Audio Space in Homes: Investigating Future Sound Management with Personal Audio Technologies. Proceedings of the 2024 ACM International Conference on Interactive Media Experiences, 276-293. DOI: 10.1145/3639701.3656313. Online publication date: 7-Jun-2024.
      • (2023) MagKnitic: Machine-knitted Passive and Interactive Haptic Textiles with Integrated Binary Sensing. Proceedings of the 36th Annual ACM Symposium on User Interface Software and Technology, 1-13. DOI: 10.1145/3586183.3606765. Online publication date: 29-Oct-2023.
      • (2023) Toucha11y: Making Inaccessible Public Touchscreens Accessible. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-13. DOI: 10.1145/3544548.3581254. Online publication date: 19-Apr-2023.
      • (2023) ParaGlassMenu: Towards Social-Friendly Subtle Interactions in Conversations. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-21. DOI: 10.1145/3544548.3581065. Online publication date: 19-Apr-2023.
      • (2023) The role of interface configuration on performance accuracy in eyes-free touchscreen interaction. Universal Access in the Information Society. DOI: 10.1007/s10209-023-01057-z. Online publication date: 31-Oct-2023.
      • (2022) BoldMove: Enabling IoT Device Control on Ubiquitous Touch Interfaces by Semantic Mapping and Sequential Selection. Extended Abstracts of the 2022 CHI Conference on Human Factors in Computing Systems, 1-7. DOI: 10.1145/3491101.3519805. Online publication date: 27-Apr-2022.
      • (2022) Climbing Keyboard: A Tilt-Based Selection Keyboard Entry for Virtual Reality. International Journal of Human–Computer Interaction, 40(5), 1327-1338. DOI: 10.1080/10447318.2022.2144120. Online publication date: 15-Nov-2022.
      • (2021) Multi-channel Tactile Feedback Based on User Finger Speed. Proceedings of the ACM on Human-Computer Interaction, 5(ISS), 1-17. DOI: 10.1145/3488549. Online publication date: 5-Nov-2021.
      • (2021) Composite Line Designs and Accuracy Measurements for Tactile Line Tracing on Touch Surfaces. Proceedings of the ACM on Human-Computer Interaction, 5(ISS), 1-17. DOI: 10.1145/3488536. Online publication date: 5-Nov-2021.
      • (2021) NavStick: Making Video Games Blind-Accessible via the Ability to Look Around. The 34th Annual ACM Symposium on User Interface Software and Technology, 538-551. DOI: 10.1145/3472749.3474768. Online publication date: 10-Oct-2021.
