DOI: 10.5555/1734454.1734471

A study of a retro-projected robotic face and its effectiveness for gaze reading by humans

Published: 02 March 2010

Abstract

Reading gaze direction is important in human-robot interaction, as it supports, among other things, joint attention and non-linguistic interaction. While most previous work focuses on enabling the robot to read gaze direction, little is known about how well the human partner in a human-robot interaction can read gaze direction from a robot. The purpose of this paper is twofold: (1) to introduce a new technology for implementing robotic faces using retro-projected animated faces, and (2) to test how well this technology supports gaze reading by humans. We briefly describe the robot design and discuss the parameters influencing the ability to read gaze direction. We then present an experiment assessing the user's ability to read gaze direction for a selection of different robotic face designs, using an actual human face as a baseline. Results indicate that it is hard to match human-human interaction performance: a face projected onto a semi-spherical surface performs worst, while faces with a human-like physiognomy and, perhaps surprisingly, video projected on a flat screen perform equally well, suggesting that these are good candidates for implementing joint attention in HRI.




Published In

HRI '10: Proceedings of the 5th ACM/IEEE International Conference on Human-Robot Interaction
March 2010
400 pages
ISBN: 9781424448937

Publisher

IEEE Press

Author Tags

  1. eye gaze
  2. human-robot interaction
  3. joint attention
  4. robotic face

Qualifiers

  • Research-article

Conference

HRI '10

Acceptance Rates

HRI '10 paper acceptance rate: 26 of 124 submissions (21%)
Overall acceptance rate: 268 of 1,124 submissions (24%)


Cited By

  • (2021) Cognitive Impact of Anthropomorphized Robot Gaze. ACM Transactions on Human-Robot Interaction, 10(4), pp. 1-14. DOI: 10.1145/3459994
  • (2020) Effects of Different Interaction Contexts when Evaluating Gaze Models in HRI. Proceedings of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, pp. 131-139. DOI: 10.1145/3319502.3374810
  • (2019) Animation Techniques in Human-Robot Interaction User Studies. ACM Transactions on Human-Robot Interaction, 8(2), pp. 1-22. DOI: 10.1145/3317325
  • (2018) OmniGaze. Proceedings of the 6th International Conference on Human-Agent Interaction, pp. 176-183. DOI: 10.1145/3284432.3284439
  • (2017) Social eye gaze in human-robot interaction. Journal of Human-Robot Interaction, 6(1), pp. 25-63. DOI: 10.5898/JHRI.6.1.Admoni
  • (2017) ThirdEye. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 5307-5312. DOI: 10.1145/3025453.3025681
  • (2016) Representing Gaze Direction in Video Communication Using Eye-Shaped Display. Adjunct Proceedings of the 29th Annual ACM Symposium on User Interface Software and Technology, pp. 65-67. DOI: 10.1145/2984751.2985705
  • (2016) Investigating Text Legibility on Non-Rectangular Displays. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, pp. 498-508. DOI: 10.1145/2858036.2858057
  • (2015) Study on Gaze Direction Perception of Face Image Displayed on Rotatable Flat Display. Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, pp. 1729-1737. DOI: 10.1145/2702123.2702369
  • (2015) Museum Guide Robot by Considering Static and Dynamic Gaze Expressions to Communicate with Visitors. Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction Extended Abstracts, pp. 125-126. DOI: 10.1145/2701973.2702011
