DOI: 10.1145/1452392.1452399
Research article

Interaction techniques for the analysis of complex data on high-resolution displays

Published: 20 October 2008

Abstract

When combined with the organizational space provided by a simple table, physical notecards are a powerful organizational tool for information analysis. The physical presence of these cards affords many benefits, but it is also a source of disadvantages; for example, complex relationships among cards are hard to represent. A number of notecard software systems have been developed to address these problems. Unfortunately, such systems lack the visual detail of real notecards spread across a large physical table; we aim to alleviate this problem with a digital solution. One challenge with new display technology and systems is providing an efficient interface for their users. In this paper we compare interaction techniques for an emerging class of organizational systems built on high-resolution tabletop displays, whose focus is to support interaction with information more easily and efficiently. We conducted a within-subjects experiment comparing PDA, token, gesture, and voice interaction techniques on a large high-resolution horizontal display. We found strengths and weaknesses for each technique, and we observed that some techniques build upon and complement others.
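
The abstract does not say how the within-subjects comparison was analyzed. Purely as an illustrative sketch (the Friedman test, task-completion time as the measure, the participant count, and every name below are assumptions, not details taken from the paper), the following shows one common way to compare four interaction techniques measured on the same participants:

    # Hypothetical illustration, not from the paper: comparing simulated
    # task-completion times for the four techniques in a within-subjects
    # design, where each participant performed the task with every technique.
    import numpy as np
    from scipy.stats import friedmanchisquare

    rng = np.random.default_rng(0)
    n_participants = 12  # assumed sample size; the abstract does not report one

    # Simulated completion times in seconds, one array per technique,
    # aligned by participant.
    times = {
        "PDA":     rng.normal(45, 8, n_participants),
        "token":   rng.normal(40, 7, n_participants),
        "gesture": rng.normal(38, 9, n_participants),
        "voice":   rng.normal(50, 10, n_participants),
    }

    # Friedman test: a non-parametric repeated-measures comparison across
    # the four matched conditions.
    stat, p = friedmanchisquare(*times.values())
    print(f"Friedman chi-square = {stat:.2f}, p = {p:.4f}")

A significant result would only indicate that at least one technique differs from the others; pairwise follow-ups (for example, Wilcoxon signed-rank tests with a multiple-comparison correction) would be needed to say which techniques differ.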


Cited By

  • (2011) Toward multimodal situated analysis. Proceedings of the 13th International Conference on Multimodal Interfaces, pp. 239-246. DOI: 10.1145/2070481.2070526. Online publication date: 14 November 2011.
  • (2010) Structuring ordered nominal data for event sequence discovery. Proceedings of the 18th ACM International Conference on Multimedia, pp. 1075-1078. DOI: 10.1145/1873951.1874153. Online publication date: 25 October 2010.

    Published In

    ICMI '08: Proceedings of the 10th international conference on Multimodal interfaces
    October 2008
    322 pages
    ISBN:9781605581989
    DOI:10.1145/1452392

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 20 October 2008


    Author Tags

    1. embodied interaction
    2. gesture interaction
    3. high-resolution displays
    4. horizontal display
    5. human-computer interaction
    6. multimodal interfaces
    7. pda interaction
    8. tabletop interaction
    9. tangible interaction
    10. voice interaction

    Qualifiers

    • Research-article

    Conference

    ICMI '08
    Sponsor:
    ICMI '08: INTERNATIONAL CONFERENCE ON MULTIMODAL INTERFACES
    October 20 - 22, 2008
    Chania, Crete, Greece

    Acceptance Rates

    Overall Acceptance Rate 453 of 1,080 submissions, 42%
