- research-article, August 2024
Gazing Heads: Investigating Gaze Perception in Video-Mediated Communication
ACM Transactions on Computer-Human Interaction (TOCHI), Volume 31, Issue 3, Article No. 39, Pages 1–31. https://doi.org/10.1145/3660343
Videoconferencing has become a ubiquitous medium for collaborative work. However, it suffers from various drawbacks such as zoom fatigue. This paper addresses the quality of user experience by exploring an enhanced system concept with the capability of ...
- Work in Progress, May 2024
Enhancing Online Meeting Experience through Shared Gaze-Attention
CHI EA '24: Extended Abstracts of the CHI Conference on Human Factors in Computing Systems, Article No. 128, Pages 1–6. https://doi.org/10.1145/3613905.3651068
Eye contact represents a fundamental element of human social interactions, providing essential non-verbal signals. Traditionally, it has played a crucial role in fostering social bonds during in-person gatherings. However, in the realm of virtual and ...
- short-paper, March 2024
Eye See You: The Emotionally Intelligent Anthropomorphic Robot Enhancing Smartphone Interaction
HRI '24: Companion of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, Pages 1269–1272. https://doi.org/10.1145/3610978.3641267
Mobile phones have become an indispensable part of people's daily lives, closely connected to emotions and needs, making emotional design for phones increasingly crucial. To expand the functionality of smartphones, we have designed ORBO, a robot that ...
- short-paper, March 2024
ORBO: The Emotionally Intelligent Anthropomorphic Robot Enhancing Smartphone Interaction
HRI '24: Companion of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, Pages 1178–1182. https://doi.org/10.1145/3610978.3640695
Smartphones have become an integral part of people's daily lives, closely linked to emotions and needs, making emotional design increasingly important. Therefore, we designed the robot ORBO, expanding the functionality of smartphones. ORBO focuses on ...
- Work in Progress, April 2023
VR, Gaze, and Visual Impairment: An Exploratory Study of the Perception of Eye Contact across different Sensory Modalities for People with Visual Impairments in Virtual Reality
CHI EA '23: Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems, Article No. 313, Pages 1–6. https://doi.org/10.1145/3544549.3585726
As social virtual reality (VR) becomes more popular, avatars are being designed with realistic behaviors incorporating non-verbal cues like eye contact. However, perceiving eye contact during a conversation can be challenging for people with visual ...
- research-article, December 2021
Project Starline: a high-fidelity telepresence system
- Jason Lawrence,
- Dan B Goldman,
- Supreeth Achar,
- Gregory Major Blascovich,
- Joseph G. Desloge,
- Tommy Fortes,
- Eric M. Gomez,
- Sascha Häberling,
- Hugues Hoppe,
- Andy Huibers,
- Claude Knaus,
- Brian Kuschak,
- Ricardo Martin-Brualla,
- Harris Nover,
- Andrew Ian Russell,
- Steven M. Seitz,
- Kevin Tong
ACM Transactions on Graphics (TOG), Volume 40, Issue 6, Article No. 242, Pages 1–16. https://doi.org/10.1145/3478513.3480490
We present a real-time bidirectional communication system that lets two people, separated by distance, experience a face-to-face conversation as if they were copresent. It is the first telepresence system that is demonstrably better than 2D ...
- research-article, October 2021
GazeChat: Enhancing Virtual Conferences with Gaze-aware 3D Photos
UIST '21: The 34th Annual ACM Symposium on User Interface Software and Technology, Pages 769–782. https://doi.org/10.1145/3472749.3474785
Communication software such as Clubhouse and Zoom has evolved to be an integral part of many people's daily lives. However, due to network bandwidth constraints and concerns about privacy, cameras in video conferencing are often turned off by ...
- research-article, November 2020
MaskMe: Using Masks to Design Collaborative Games for Helping Children with Autism Make Eye Contact
ISS Companion '20: Companion Proceedings of the 2020 Conference on Interactive Surfaces and Spaces, Pages 29–32. https://doi.org/10.1145/3380867.3426206
Eye contact disorder is one of the typical characteristics of children with Autism Spectrum Disorder (ASD), and this issue is directly related to the deficiencies in social interaction. In this poster, we present MaskMe, a collaborative game to help ...
- abstract, December 2020
Using Markov Models and Classification to Understand Face Exploration Dynamics in Boys with Autism
ICMI '20 Companion: Companion Publication of the 2020 International Conference on Multimodal Interaction, Page 507. https://doi.org/10.1145/3395035.3425179
Scanning faces is important for social interactions, and maintaining good eye contact carries significant social value. Difficulty with the social use of eye contact constitutes one of the clinical symptoms of autism spectrum disorder (ASD). It has been ...
- abstract, April 2020
Investigating the Effectiveness of Different Interaction Modalities for Spatial Human-robot Interaction
HRI '20: Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, Pages 239–241. https://doi.org/10.1145/3371382.3378273
With the increasing use of social robots in real environments, one of the areas of research requiring more attention is the study of human-robot interaction (HRI) when a person and robot are moving close to each other. Understanding effective ways to ...
- research-article, October 2017
Everyday Eye Contact Detection Using Unsupervised Gaze Target Discovery
UIST '17: Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology, Pages 193–203. https://doi.org/10.1145/3126594.3126614
Eye contact is an important non-verbal cue in social signal processing and promising as a measure of overt attention in human-object interactions and attentive user interfaces. However, robust detection of eye contact across different users, gaze ...
- research-article, September 2017
Detecting Gaze Towards Eyes in Natural Social Interactions and Its Use in Child Assessment
- Eunji Chong,
- Katha Chanda,
- Zhefan Ye,
- Audrey Southerland,
- Nataniel Ruiz,
- Rebecca M. Jones,
- Agata Rozga,
- James M. Rehg
Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT), Volume 1, Issue 3, Article No. 43, Pages 1–20. https://doi.org/10.1145/3131902
Eye contact is a crucial element of non-verbal communication that signifies interest, attention, and participation in social interactions. As a result, measures of eye contact arise in a variety of applications such as the assessment of the social ...
- research-article, May 2017
A Multifaceted Study on Eye Contact based Speaker Identification in Three-party Conversations
CHI '17: Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, Pages 3011–3021. https://doi.org/10.1145/3025453.3025644
To precisely understand human gaze behaviors in three-party conversations, this work investigates whether the speaker can be reliably identified from the interlocutors in a three-party conversation on the basis of the interactive behaviors ...
- research-article, January 2017
Gesture triggered, dynamic gaze alignment architecture for intelligent eLearning systems
Journal of Intelligent & Fuzzy Systems: Applications in Engineering and Technology (JIFS), Volume 32, Issue 4, Pages 2963–2969. https://doi.org/10.3233/JIFS-169239
Current eLearning systems enable streaming of live lectures to distant students, facilitating a live instructor-student interaction. However, studies have shown that there exists a marked divide in local students' (students present in the teacher's location)...
- research-article, November 2016
A multimodal and multilevel system for robotics treatment of autism in children
DAA '16: Proceedings of the International Workshop on Social Learning and Multimodal Interaction for Designing Artificial Agents, Article No. 3, Pages 1–6. https://doi.org/10.1145/3005338.3005341
Several studies suggest that robots can play a relevant role to address Autistic Spectrum Disorder (ASD). This paper presents a humanoid social robot-assisted behavioral system based on a therapeutic multilevel treatment protocol customized to improve ...
- poster, October 2016
Model-Driven Gaze Simulation for the Blind Person in Face-to-Face Communication
HAI '16: Proceedings of the Fourth International Conference on Human Agent Interaction, Pages 59–62. https://doi.org/10.1145/2974804.2980482
In face-to-face communication, eye gaze is integral to a conversation, supplementing verbal language. The sighted often use eye gaze to convey nonverbal information in social interactions, which a blind conversation partner cannot access and react to ...
- research-article, October 2015
E-Gaze: Create Gaze Communication for People with Visual Disability
HAI '15: Proceedings of the 3rd International Conference on Human-Agent Interaction, Pages 199–202. https://doi.org/10.1145/2814940.2814974
Gaze signals are frequently used by the sighted in social interactions as visual cues. However, these signals and cues are hardly accessible for people with visual disability. A conceptual design of E-Gaze glasses is proposed, assistive to create gaze ...
- research-article, October 2015
Spatial Communication and Recognition in Human-agent Interaction using Motion-parallax-based 3DCG Virtual Agent
HAI '15: Proceedings of the 3rd International Conference on Human-Agent Interaction, Pages 97–103. https://doi.org/10.1145/2814940.2814954
In this paper, we propose spatial communication between a virtual agent and a user through common space in both virtual world and real space. For this purpose, we propose the virtual agent system SCoViA, which renders a synchronized synthesis of the ...
- research-article, December 2014
Eye-to-eye contact for life-sized videoconferencing
OzCHI '14: Proceedings of the 26th Australian Computer-Human Interaction Conference on Designing Futures: the Future of Design, Pages 145–148. https://doi.org/10.1145/2686612.2686632
Videoconferencing systems available to end users do not allow for eye-to-eye contact between participants. The different locations of the video camera and video display make it impossible to look directly into each other's eyes. This issue is known as the ...
- research-article, November 2014
Analysis of Timing Structure of Eye Contact in Turn-changing
GazeIn '14: Proceedings of the 7th Workshop on Eye Gaze in Intelligent Human Machine Interaction: Eye-Gaze & Multimodality, Pages 15–20. https://doi.org/10.1145/2666642.2666648
With the aim of constructing a model for predicting the next speaker and the start of the next utterance in multi-party meetings, we focus on the timing structure of the eye contact between the speaker, the listener, and the next speaker: who looks at ...