DOI: 10.1145/1054972.1055005
Article

Effects of task properties, partner actions, and message content on eye gaze patterns in a collaborative task

Published: 02 April 2005

Abstract

Helpers providing guidance for collaborative physical tasks shift their gaze between the workspace, supply area, and instructions. Understanding when and why helpers gaze at each area is important both for a theoretical understanding of collaboration on physical tasks and for the design of automated video systems for remote collaboration. In a laboratory experiment using a collaborative puzzle task, we recorded helpers' gaze while manipulating task complexity and piece differentiability. Helpers gazed toward the pieces bay more frequently when pieces were difficult to differentiate and less frequently over repeated trials. Preliminary analyses of message content show that helpers tend to look at the pieces bay when describing the next piece and at the workspace when describing where it goes. The results are consistent with a grounding model of communication, in which helpers seek visual evidence of understanding unless they are confident that they have been understood. The results also suggest the feasibility of building automated video systems based on remote helpers' shifting visual requirements.
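
The closing design implication can be made concrete. Below is a minimal sketch, in Python, of the kind of automated view-switching policy the findings hint at: route the remote feed to the pieces bay while the helper is describing the next piece, and to the workspace while the helper is describing where it goes. The keyword cues, view names, and select_view function are hypothetical illustrations; the paper proposes no specific implementation.

```python
from enum import Enum, auto

class View(Enum):
    PIECES_BAY = auto()
    WORKSPACE = auto()

# Hypothetical keyword cues; the paper's message-content coding is far richer.
PIECE_DESCRIPTION_CUES = {"piece", "color", "shape", "red", "square"}
PLACEMENT_CUES = {"put", "place", "next to", "above", "below", "corner"}

def select_view(utterance: str, current: View) -> View:
    """Pick the camera feed to show the remote helper, using message
    content as a crude proxy for the gaze shifts reported in the paper."""
    text = utterance.lower()
    if any(cue in text for cue in PLACEMENT_CUES):
        return View.WORKSPACE      # helper is describing where the piece goes
    if any(cue in text for cue in PIECE_DESCRIPTION_CUES):
        return View.PIECES_BAY     # helper is describing the next piece
    return current                 # no cue: hold the current feed

# Example: the feed follows the helper's shifting visual requirements.
view = View.WORKSPACE
for utt in ["Find the small red piece", "Place it in the top corner"]:
    view = select_view(utt, view)
    print(utt, "->", view.name)
```

Holding the current feed when no cue fires is deliberate: under the grounding account, the helper keeps monitoring the same area until confident the instruction has been understood.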


Published In

CHI '05: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
April 2005
928 pages
ISBN: 1581139985
DOI: 10.1145/1054972


Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. collaborative work
  2. computer-supported cooperative work
  3. conversational analysis
  4. empirical studies
  5. eye-tracking
  6. gesture
  7. video conferencing
  8. video mediated communication

Qualifiers

  • Article

Conference

CHI '05

Acceptance Rates

CHI '05 Paper Acceptance Rate: 93 of 372 submissions, 25%
Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%



