DOI:10.1145/1344471.1344515

Deixis and gaze in collaborative work at a distance (over a shared map): a computational model to detect misunderstandings

Published: 26 March 2008

Abstract

This paper presents an algorithm that detects misunderstandings in collaborative work at a distance. It analyses the movements of collaborators' eyes over the shared workspace, their utterances containing references to this workspace, and the availability of 'remote' deictic gestures. The method is based on two findings: (1) participants look at the points they are talking about in their messages; (2) their gazes cluster more densely around these points than around other locations they glance at in the same timeframe. The algorithm associates the distance between the gaze of a message's emitter and the gaze of its receiver with the probability that the receiver did not understand the message.
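The core idea described in the abstract can be sketched as follows. This is not the authors' implementation: the distance measure (mean nearest-neighbour distance between fixation points), the logistic mapping, and the `scale` threshold are all illustrative assumptions.

```python
import math

def mean_min_distance(emitter_fix, receiver_fix):
    """Mean distance (in pixels) from each emitter fixation to the
    receiver's nearest fixation in the same time window.
    Each fixation is an (x, y) point on the shared workspace."""
    return sum(
        min(math.dist(e, r) for r in receiver_fix)
        for e in emitter_fix
    ) / len(emitter_fix)

def misunderstanding_probability(emitter_fix, receiver_fix, scale=100.0):
    """Map gaze divergence to a probability that the receiver did not
    understand the message, via a logistic curve.
    `scale` is a hypothetical divergence threshold in pixels, not a
    value taken from the paper."""
    d = mean_min_distance(emitter_fix, receiver_fix)
    return 1.0 / (1.0 + math.exp(-(d - scale) / (scale / 4)))
```

Under this sketch, closely coupled gazes yield a probability near 0, while widely separated gazes yield a probability near 1, matching the abstract's claim that emitter-receiver gaze distance indexes likely misunderstanding.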



Published In

ETRA '08: Proceedings of the 2008 symposium on Eye tracking research & applications
March 2008
285 pages
ISBN:9781595939821
DOI:10.1145/1344471

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. computer-supported collaborative work
  2. focus of attention
  3. remote collaborative tasks
  4. remote deixis
  5. spatial cognition

Qualifiers

  • Research-article

Conference

ETRA '08
ETRA '08: Eye Tracking Research and Applications
March 26 - 28, 2008
Savannah, Georgia, USA

Acceptance Rates

Overall Acceptance Rate 69 of 137 submissions, 50%
