Exploring the potential of a mobile eye tracker as an intuitive indoor pointing device

Published: 01 April 2018

Abstract

Current technology offers a variety of ways to deliver context-aware information to mobile users. The most challenging aspect, however, is determining what the user is interested in. The user's position is the best available hint, but if we know what the user is looking at and what his or her gazing profile is, we can narrow down the possibly relevant objects of interest. With the advent of mobile and ubiquitous computing, it is time to explore the potential of mobile eye tracking technology for natural, intelligent interaction between users and their smart environment, not only for specific tasks, but also for the more ambitious goal of integrating eye tracking into the process of inferring mobile users' interests in order to provide them with relevant services, a research area that has so far received little attention.

In this work, we examine the potential of integrating a mobile eye tracker, as a natural interaction device, into an audio guide system for museum visitors. Using the eye tracker as a pointing device enables the system to reason unobtrusively about the user's focus of attention and to deliver relevant information about it as needed. To realize this goal, we integrated an image-matching technique for indoor positioning and an eye-gaze detection technique for identifying the user's focus of attention into two versions of a mobile audio guide: (1) a proactive version that delivers information automatically whenever user interest is detected, and (2) a reactive version that notifies the user about the availability of this information, thus giving the user more control over information delivery. In addition, we developed a conventional mobile museum visitors' guide that uses a smartphone and Bluetooth Low Energy beacons for positioning; this guide served as a reference system.

The three museum visitors' guides were evaluated in realistic settings at the Hecht Museum (http://mushecht.haifa.ac.il/Default_eng.aspx), a small museum at the University of Haifa that holds both archeological and art collections. The experimental evaluation compared the contribution of the three versions of the audio guide to the visit experience. The results showed that mobile eye tracking technology, although unfamiliar and perhaps even immature, was accepted by the participants. The mobile eye tracker audio guide was perceived as preferable to the conventional mobile museum guide, especially with regard to learning during the visit. Furthermore, with regard to proactivity in context-aware systems, the results showed that participants like to be in control: most of them preferred the reactive version of the mobile eye tracker audio guide over the proactive one.

Highlights

• The potential use of a mobile eye tracker in a museum is analyzed and examined.
• A mobile museum visitors' guide that uses a mobile eye tracker as a pointing device is described.
• A user study comparing a museum visitors' guide that uses an eye tracker with a conventional one is presented.
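The pointing mechanism described in the abstract, matching the eye tracker's scene-camera view against stored images of exhibits and then checking whether the gaze point falls on a recognized exhibit, can be illustrated in a few lines. Below is a minimal sketch, not the authors' implementation: it assumes OpenCV-style ORB feature matching with RANSAC homography estimation, and the names find_gazed_exhibit and exhibit_db are hypothetical.

    # Minimal sketch (assumption: OpenCV ORB matching + RANSAC homography) of
    # gaze-based exhibit identification; not the paper's actual implementation.
    import cv2
    import numpy as np

    orb = cv2.ORB_create(nfeatures=1000)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)

    def find_gazed_exhibit(frame, gaze_xy, exhibit_db, min_inliers=15):
        """frame: scene-camera image; gaze_xy: (x, y) gaze point in frame
        coordinates; exhibit_db: list of (name, keypoints, descriptors, shape)
        tuples precomputed with the same ORB detector on exhibit photos."""
        kp_f, des_f = orb.detectAndCompute(frame, None)
        if des_f is None:
            return None
        for name, kp_e, des_e, shape in exhibit_db:
            # 2-NN matching with Lowe's ratio test keeps distinctive matches.
            pairs = matcher.knnMatch(des_e, des_f, k=2)
            good = [p[0] for p in pairs
                    if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]
            if len(good) < min_inliers:
                continue
            src = np.float32([kp_e[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
            dst = np.float32([kp_f[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
            # Robust homography (RANSAC) from the exhibit photo to the live frame.
            H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
            if H is None or int(inlier_mask.sum()) < min_inliers:
                continue
            # Project the exhibit outline into the frame; if the gaze point
            # falls inside it, this exhibit is the visitor's focus of attention.
            h, w = shape[:2]
            corners = np.float32([[0, 0], [w, 0], [w, h], [0, h]]).reshape(-1, 1, 2)
            outline = cv2.perspectiveTransform(corners, H)
            if cv2.pointPolygonTest(outline, gaze_xy, False) >= 0:
                return name
        return None

In terms of the two guide versions contrasted in the abstract, a stable result from such a matcher would trigger audio playback directly in the proactive guide, while the reactive guide would only raise a notification and leave the decision to the visitor.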



      Published In

Future Generation Computer Systems, Volume 81, Issue C
      April 2018
      580 pages

      Publisher

      Elsevier Science Publishers B. V.

      Netherlands


      Author Tags

      1. Mobile eye tracking
2. Museum visitors' guide

      Qualifiers

      • Research-article

      Cited By

• (2024) Artwork Segmentation in Eye-Tracking Experiments: Challenges and Future Directions. Adjunct Proceedings of the 32nd ACM Conference on User Modeling, Adaptation and Personalization, pp. 477-481. DOI: 10.1145/3631700.3664906. Online publication date: 27-Jun-2024.
• (2023) Eyeing the Visitor's Gaze for Artwork Recommendation. Adjunct Proceedings of the 31st ACM Conference on User Modeling, Adaptation and Personalization, pp. 374-378. DOI: 10.1145/3563359.3596670. Online publication date: 26-Jun-2023.
• (2023) How Can We Set Up Eye Trackers in a Real Classroom? Using Mobile Eye Trackers to Record Learners' Visual Attention During Learning Statistical Graphs with Different Complex Levels. Innovative Technologies and Learning, pp. 315-325. DOI: 10.1007/978-3-031-40113-8_31. Online publication date: 28-Aug-2023.
• (2022) ARIDF: Automatic Representative Image Dataset Finder for Image Based Localization. Adjunct Proceedings of the 30th ACM Conference on User Modeling, Adaptation and Personalization, pp. 383-390. DOI: 10.1145/3511047.3537661. Online publication date: 4-Jul-2022.
• (2022) Context Awareness in Cultural Heritage Applications: A Survey. Journal on Computing and Cultural Heritage, 15(2), pp. 1-31. DOI: 10.1145/3480953. Online publication date: 7-Apr-2022.
• (2021) Exploring Potential Gestures for Controlling an Eye-Tracker Based System. Proceedings of the 20th International Conference on Mobile and Ubiquitous Multimedia, pp. 211-213. DOI: 10.1145/3490632.3497836. Online publication date: 5-Dec-2021.
• (2020) EyeLinks: Methods to compute reliable stereo mappings used for eye gaze tracking. ACM Symposium on Eye Tracking Research and Applications, pp. 1-5. DOI: 10.1145/3379156.3391354. Online publication date: 2-Jun-2020.
• (2020) Enhancing cultural heritage outdoor experience with augmented-reality smart glasses. Personal and Ubiquitous Computing, 24(6), pp. 873-886. DOI: 10.1007/s00779-020-01366-7. Online publication date: 18-Jan-2020.
• (2019) A cognition-centered personalization framework for cultural-heritage content. User Modeling and User-Adapted Interaction, 29(1), pp. 9-65. DOI: 10.1007/s11257-019-09226-7. Online publication date: 1-Mar-2019.
