In the previous workshops, we have discussed a wide range of issues related to eye gaze: technologies for sensing human attentional behaviors, the roles of attentional behaviors as social gaze in human-human and human-humanoid interaction, attentional behaviors in problem solving and task performance, gaze-based intelligent user interfaces, and the evaluation of gaze-based UIs. In addition to these topics, this year's workshop focuses on eye gaze in multimodal interpretation and generation. Since eye gaze is one of the facial communication modalities, gaze information can be combined with other modalities or bodily motions to contribute to the meaning of an utterance and serve as a communication signal. This workshop aims to continue exploring this growing area of research by bringing together researchers from fields including human sensing, multimodal processing, humanoid interfaces, intelligent user interfaces, and communication science. We will exchange ideas to develop and improve methodologies for this research area, with the long-term goal of establishing a strong interdisciplinary research community in "attention aware interactive systems".
Brain-enhanced synergistic attention (BESA)
In this paper, we describe a hybrid human-machine system for searching and detecting Objects of Interest (OI) in imagery. Automated methods for OI detection based on models of human visual attention have received much interest, but are inherently bottom-...
Multi-modal object of interest detection using eye gaze and RGB-D cameras
This paper presents a low-cost, wearable headset for mobile 3D Point of Gaze (PoG) estimation in assistive applications. The device consists of an eye tracking camera and forward facing RGB-D scene camera which are able to provide an estimate of the ...
Perception of gaze direction for situated interaction
Accurate human perception of robots' gaze direction is crucial for the design of a natural and fluent situated multimodal face-to-face interaction between humans and machines. In this paper, we present an experiment targeted at quantifying the effects ...
A head-eye coordination model for animating gaze shifts of virtual characters
We present a parametric, computational model of head-eye coordination that can be used in the animation of directed gaze shifts for virtual characters. The model is based on research in human neurophysiology. It incorporates control parameters that ...
From the eye to the heart: eye contact triggers emotion simulation
Smiles are complex facial expressions that carry multiple meanings. Recent literature suggests that deep processing of smiles via embodied simulation can be triggered by achieved eye contact. Three studies supported this prediction. In Study 1, ...
Addressee identification for human-human-agent multiparty conversations in different proxemics
This paper proposes a method for identifying the addressee based on speech and gaze information, and shows that the proposed method can be applicable to human-human-agent multiparty conversations in different proxemics. First, we collected human-human-...
Hard lessons learned: mobile eye-tracking in cockpits
Eye-tracking presents an attractive tool in testing of design alternatives in all stages of interface evaluation. Access to the operator's visual attention behaviors provides information supporting design decisions. While mobile eye-tracking increases ...
Analysis on learners' gaze patterns and the instructor's reactions in ballroom dance tutoring
The use of virtual conversational agents is anticipated in the tutoring of physical skills such as sports or dance. This paper describes an ongoing project aiming to realize a virtual instructor for ballroom dance. First, a human-human experiment is ...
Multimodal corpus of conversations in mother tongue and second language by same interlocutors
This paper describes multi-modal data collected from conversations held both in the mother tongue and in a second language. We also compare eye movements and utterance styles between communications in the mother tongue and second ...
Gaze and conversational engagement in multiparty video conversation: an annotation scheme and classification of high and low levels of engagement
When using a multiparty video-mediated system, interacting participants assume various roles and exhibit behaviors according to how engaged in the communication they are. In this paper we focus on the estimation of conversational engagement from ...
Visual interaction and conversational activity
In addition to the contents of their speech, people who are engaged in a conversation express themselves in many nonverbal ways. This means that people interact and are attended to even when they are not speaking. In this pilot study, we created an ...
Move it there, or not?: the design of voice commands for gaze with speech
This paper presents an experiment that was conducted to investigate gaze combined with voice commands. There has been very little research about the design of voice commands for this kind of input. It is not known yet if users prefer longer sentences ...
Eye gaze assisted human-computer interaction in a hand gesture controlled multi-display environment
We present a human-computer interaction (HCI) framework that processes user input in a multi-display environment and is able to detect and interpret dynamic hand gesture input. In an environment equipped with large displays, full contactless application ...
A framework of personal assistant for computer users by analyzing video stream
Time spent at the computer is increasing steadily with the rapid development of the Internet. During long periods in front of the computer, bad posture and habits can result in health risks, and unawareness of fatigue will impair ...
Simple multi-party video conversation system focused on participant eye gaze: "Ptolemaeus" provides participants with smooth turn-taking
This paper shows a prototype system that provides a natural multi-party conversation environment among participants in different places. Eye gaze is an important feature for maintaining smooth multi-party conversations because it indicates whom the ...
Sensing visual attention using an interactive bidirectional HMD
This paper presents a novel system for sensing attentional behavior in Augmented Reality (AR) environments by analyzing eye movement. The system is based on lightweight head-mounted optical see-through glasses containing bidirectional microdisplays, ...
Semantic interpretation of eye movements using designed structures of displayed contents
This paper presents a novel framework to interpret eye movements using semantic relations and spatial layouts of displayed contents, i.e., the designed structure. We represent eye movements in a multi-scale, interval-based manner and associate them with ...
A communication support interface based on learning awareness for collaborative learning
The development of information communication technologies allows learners to study together with others through networks. To realize successful collaborative learning in such distributed environments, supporting their communication is important because ...
Acceptance Rates
| Year | Submitted | Accepted | Rate |
|---|---|---|---|
| GazeIn '14 | 8 | 8 | 100% |
| GazeIn '13 | 13 | 11 | 85% |
| Overall | 21 | 19 | 90% |