DOI: 10.1145/3485279.3485297

Effect of Visual Cues on Pointing Tasks in Co-located Augmented Reality Collaboration

Published: 09 November 2021

Abstract

Visual cues are essential in computer-mediated communication, especially when collaboration requires several users to focus their attention on a specific object among other similar ones. This paper explores the effect of visual cues on pointing tasks in co-located Augmented Reality (AR) collaboration. A user study (N = 32, 16 pairs) was conducted to compare two head-based visual cue techniques: Pointing Line (PL) and Moving Track (MT). Across a series of collaborative pointing tasks on objects with different states (static and dynamic) and density levels (low, medium, and high), PL performed better in terms of task performance and usability, whereas MT was rated higher on social presence and user preference. Based on these results, we provide design implications for pointing tasks in co-located AR collaboration.




Published In

SUI '21: Proceedings of the 2021 ACM Symposium on Spatial User Interaction
November 2021
206 pages
ISBN: 9781450390910
DOI: 10.1145/3485279
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 09 November 2021


Author Tags

  1. Augmented Reality
  2. Co-located collaboration
  3. Pointing Tasks
  4. Visual cues

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

SUI '21
SUI '21: Symposium on Spatial User Interaction
November 9 - 10, 2021
Virtual Event, USA

Acceptance Rates

Overall Acceptance Rate 86 of 279 submissions, 31%


Cited By

  • (2024) Fostering the AR illusion: a study of how people interact with a shared artifact in collocated augmented reality. Frontiers in Virtual Reality, 5. DOI: 10.3389/frvir.2024.1428765. Online publication date: 20-Aug-2024.
  • (2024) Cues to fast‐forward collaboration: A Survey of Workspace Awareness and Visual Cues in XR Collaborative Systems. Computer Graphics Forum, 43(2). DOI: 10.1111/cgf.15066. Online publication date: 30-Apr-2024.
  • (2024) Evaluation of Shared-Gaze Visualizations for Virtual Assembly Tasks. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 765-766. DOI: 10.1109/VRW62533.2024.00179. Online publication date: 16-Mar-2024.
  • (2024) Evaluation on Different Object Selection Visual Feedback in Collaborative Augmented Reality. International Journal of Human–Computer Interaction, 1-17. DOI: 10.1080/10447318.2024.2389727. Online publication date: 15-Aug-2024.
  • (2024) Examining the Use of DanMu for Crowdsourcing Control in Virtual Gatherings. International Journal of Human–Computer Interaction, 1-19. DOI: 10.1080/10447318.2024.2375700. Online publication date: 18-Jul-2024.
  • (2024) Exploring Social Learning in Collaborative Augmented Reality With Pedagogical Agents as Learning Companions. International Journal of Human–Computer Interaction, 1-26. DOI: 10.1080/10447318.2024.2323280. Online publication date: 11-Mar-2024.
  • (2024) Examining the effect of augmented reality experience duration on reading comprehension and cognitive load. Education and Information Technologies. DOI: 10.1007/s10639-024-12864-z. Online publication date: 29-Jun-2024.
  • (2024) Feasibility and performance enhancement of collaborative control of unmanned ground vehicles via virtual reality. Personal and Ubiquitous Computing, 28(3-4), 579-595. DOI: 10.1007/s00779-024-01799-4. Online publication date: 9-May-2024.
  • (2023) Evaluating User Performance, Workload, and Presence of Virtual Reality Questionnaires Using Joystick and Raycasting Selection Techniques. Proceedings of the 2023 7th International Conference on Virtual and Augmented Reality Simulations, 29-34. DOI: 10.1145/3603421.3603426. Online publication date: 3-Mar-2023.
  • (2023) Workshop: Mixing Realities: Cross-Reality Visualization, Interaction, and Collaboration. 2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 298-300. DOI: 10.1109/VRW58643.2023.00069. Online publication date: Mar-2023.
