DOI: 10.1145/2401836
Gaze-In '12: Proceedings of the 4th Workshop on Eye Gaze in Intelligent Human Machine Interaction
ACM 2012 Proceedings
Publisher: Association for Computing Machinery, New York, NY, United States
Conference: ICMI '12: International Conference on Multimodal Interaction, Santa Monica, California, 26 October 2012
ISBN: 978-1-4503-1516-6
Published: 26 October 2012
Abstract

In previous workshops, we have discussed a wide range of issues around eye gaze: technologies for sensing human attentional behaviors, the roles of attentional behaviors as social gaze in human-human and human-humanoid interaction, attentional behaviors in problem solving and task performance, gaze-based intelligent user interfaces, and the evaluation of gaze-based UIs. In addition to these topics, this year's workshop focuses on eye gaze in multimodal interpretation and generation. Since eye gaze is one of the facial communication modalities, gaze information can be combined with other modalities or bodily motions to contribute to the meaning of an utterance and serve as a communication signal. This workshop aims to continue exploring this growing area of research by bringing together researchers from fields including human sensing, multimodal processing, humanoid interfaces, intelligent user interfaces, and communication science. We will exchange ideas to develop and improve methodologies for this research area, with the long-term goal of establishing a strong interdisciplinary research community in "attention aware interactive systems".

Table of Contents
research-article
Brain-enhanced synergistic attention (BESA)
Article No.: 1, Pages 1–7, https://doi.org/10.1145/2401836.2401837

In this paper, we describe a hybrid human-machine system for searching for and detecting Objects of Interest (OI) in imagery. Automated methods for OI detection based on models of human visual attention have received much interest, but are inherently bottom-...

research-article
Multi-modal object of interest detection using eye gaze and RGB-D cameras
Article No.: 2, Pages 1–6, https://doi.org/10.1145/2401836.2401838

This paper presents a low-cost, wearable headset for mobile 3D Point of Gaze (PoG) estimation in assistive applications. The device consists of an eye-tracking camera and a forward-facing RGB-D scene camera, which together provide an estimate of the ...

research-article
Perception of gaze direction for situated interaction
Article No.: 3, Pages 1–6, https://doi.org/10.1145/2401836.2401839

Accurate human perception of robots' gaze direction is crucial for the design of a natural and fluent situated multimodal face-to-face interaction between humans and machines. In this paper, we present an experiment targeted at quantifying the effects ...

research-article
A head-eye coordination model for animating gaze shifts of virtual characters
Article No.: 4, Pages 1–6, https://doi.org/10.1145/2401836.2401840

We present a parametric, computational model of head-eye coordination that can be used in the animation of directed gaze shifts for virtual characters. The model is based on research in human neurophysiology. It incorporates control parameters that ...

research-article
From the eye to the heart: eye contact triggers emotion simulation
Article No.: 5, Pages 1–7, https://doi.org/10.1145/2401836.2401841

Smiles are complex facial expressions that carry multiple meanings. Recent literature suggests that deep processing of smiles via embodied simulation can be triggered by achieved eye contact. Three studies supported this prediction. In Study 1, ...

research-article
Addressee identification for human-human-agent multiparty conversations in different proxemics
Article No.: 6, Pages 1–6, https://doi.org/10.1145/2401836.2401842

This paper proposes a method for identifying the addressee based on speech and gaze information, and shows that the proposed method is applicable to human-human-agent multiparty conversations in different proxemics. First, we collected human-human-...

research-article
Hard lessons learned: mobile eye-tracking in cockpits
Article No.: 7, Pages 1–6, https://doi.org/10.1145/2401836.2401843

Eye-tracking is an attractive tool for testing design alternatives at all stages of interface evaluation. Access to the operator's visual attention behaviors provides information that supports design decisions. While mobile eye-tracking increases ...

research-article
Analysis on learners' gaze patterns and the instructor's reactions in ballroom dance tutoring
Article No.: 8, Pages 1–6, https://doi.org/10.1145/2401836.2401844

Virtual conversational agents hold promise for tutoring physical skills such as sports and dance. This paper describes an ongoing project aiming to realize a virtual instructor for ballroom dance. First, a human-human experiment is ...

research-article
Multimodal corpus of conversations in mother tongue and second language by same interlocutors
Article No.: 9, Pages 1–5, https://doi.org/10.1145/2401836.2401845

In this paper, we describe multi-modal data collected from conversations held in both the mother tongue and a second language. We also compare eye movements and utterance styles between communications in the mother tongue and second ...

research-article
Gaze and conversational engagement in multiparty video conversation: an annotation scheme and classification of high and low levels of engagement
Article No.: 10, Pages 1–6, https://doi.org/10.1145/2401836.2401846

When using a multiparty video-mediated system, interacting participants assume a range of roles and exhibit behaviors according to how engaged in the communication they are. In this paper we focus on estimation of conversational engagement from ...

research-article
Visual interaction and conversational activity
Article No.: 11, Pages 1–6, https://doi.org/10.1145/2401836.2401847

In addition to the contents of their speech, people who are engaged in a conversation express themselves in many nonverbal ways. This means that people interact and are attended to even when they are not speaking. In this pilot study, we created an ...

research-article
Move it there, or not?: the design of voice commands for gaze with speech
Article No.: 12, Pages 1–3, https://doi.org/10.1145/2401836.2401848

This paper presents an experiment conducted to investigate gaze combined with voice commands. There has been very little research on the design of voice commands for this kind of input. It is not yet known whether users prefer longer sentences ...

research-article
Eye gaze assisted human-computer interaction in a hand gesture controlled multi-display environment
Article No.: 13, Pages 1–3, https://doi.org/10.1145/2401836.2401849

A special human-computer interaction (HCI) framework for processing user input in a multi-display environment can detect and interpret dynamic hand gesture input. In an environment equipped with large displays, full contactless application ...

research-article
A framework of personal assistant for computer users by analyzing video stream
Article No.: 14, Pages 1–3, https://doi.org/10.1145/2401836.2401850

Time spent at the computer is increasing steadily with the rapid development of the Internet. During long periods in front of the computer, bad posture and habits result in health risks, and unawareness of fatigue impairs ...

research-article
Simple multi-party video conversation system focused on participant eye gaze: "Ptolemaeus" provides participants with smooth turn-taking
Article No.: 15, Pages 1–3, https://doi.org/10.1145/2401836.2401851

This paper presents a prototype system that provides a natural multi-party conversation environment among participants in different places. Eye gaze is an important feature for maintaining smooth multi-party conversations because it indicates whom the ...

research-article
Sensing visual attention using an interactive bidirectional HMD
Article No.: 16, Pages 1–3, https://doi.org/10.1145/2401836.2401852

This paper presents a novel system for sensing attentional behavior in Augmented Reality (AR) environments by analyzing eye movement. The system is based on lightweight head-mounted optical see-through glasses containing bidirectional microdisplays, ...

research-article
Semantic interpretation of eye movements using designed structures of displayed contents
Article No.: 17, Pages 1–3, https://doi.org/10.1145/2401836.2401853

This paper presents a novel framework to interpret eye movements using semantic relations and spatial layouts of displayed contents, i.e., the designed structure. We represent eye movements in a multi-scale, interval-based manner and associate them with ...

research-article
A communication support interface based on learning awareness for collaborative learning
Article No.: 18, Pages 1–3, https://doi.org/10.1145/2401836.2401854

The development of information communication technologies allows learners to study together with others through networks. To realize successful collaborative learning in such distributed environments, supporting their communication is important because ...


Acceptance Rates

Overall Acceptance Rate: 19 of 21 submissions, 90%

Year          Submitted   Accepted   Rate
GazeIn '14    8           8          100%
GazeIn '13    13          11         85%
Overall       21          19         90%