DOI: 10.1145/3440054.3440058

Driver's Gaze Zone Estimation Method: A Four-channel Convolutional Neural Network Model

Published: 01 February 2021

Abstract

The driver's gaze has become an important indicator for analyzing the driving state. By estimating a driver's gaze zone, we can further judge their fatigue state and even predict their next driving intention. In this paper, we propose a four-channel gaze estimation model based on a Convolutional Neural Network (CNN) to estimate the driver's gaze zone. In the proposed method, images of the right eye, the left eye, the face, and the head are used as the input data of the multi-channel CNN. The features of the different channels are then fused to estimate the gaze zone. Finally, we compare our method with several existing methods; the experimental results show that our method achieves an accuracy of 96%.
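As a rough illustration of the four-channel design described in the abstract, the sketch below builds a small CNN in PyTorch with one branch per input image (right eye, left eye, face, head), concatenates the branch features, and classifies the fused vector into gaze zones. The input resolution (64×64 RGB crops), the layer sizes, the number of gaze zones (9), and concatenation as the fusion step are illustrative assumptions; the abstract does not specify these details.

```python
# Minimal sketch (PyTorch) of a four-channel CNN for gaze zone classification.
# Assumptions not taken from the paper: 64x64 RGB crops, two conv blocks per
# branch, a 9-zone output, and simple feature concatenation as the fusion step.
import torch
import torch.nn as nn


def image_branch():
    """One convolutional branch; the same structure is reused for each input."""
    return nn.Sequential(
        nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
        nn.MaxPool2d(2),                              # 64x64 -> 32x32
        nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
        nn.MaxPool2d(2),                              # 32x32 -> 16x16
        nn.Flatten(),
        nn.Linear(32 * 16 * 16, 128), nn.ReLU(),
    )


class FourChannelGazeNet(nn.Module):
    def __init__(self, num_zones: int = 9):
        super().__init__()
        # One branch per input image: right eye, left eye, face, head.
        self.right_eye = image_branch()
        self.left_eye = image_branch()
        self.face = image_branch()
        self.head = image_branch()
        # Fuse the four 128-d feature vectors and classify the gaze zone.
        self.classifier = nn.Sequential(
            nn.Linear(4 * 128, 128), nn.ReLU(),
            nn.Linear(128, num_zones),
        )

    def forward(self, right_eye, left_eye, face, head):
        fused = torch.cat([
            self.right_eye(right_eye),
            self.left_eye(left_eye),
            self.face(face),
            self.head(head),
        ], dim=1)
        return self.classifier(fused)                 # logits over gaze zones


if __name__ == "__main__":
    model = FourChannelGazeNet()
    crops = [torch.randn(2, 3, 64, 64) for _ in range(4)]  # dummy batch of 2
    print(model(*crops).shape)                        # -> torch.Size([2, 9])
```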




Published In

BDSIC '20: Proceedings of the 2020 2nd International Conference on Big-data Service and Intelligent Computation
December 2020
69 pages
ISBN: 9781450388399
DOI: 10.1145/3440054

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 01 February 2021

Author Tags

  1. Gaze estimation
  2. convolutional neural network
  3. deep learning
  4. driver gaze zone estimation
  5. eye tracking

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Funding Sources

  • the Shandong Provincial Key R&D Program
  • the National Natural Science Foundation of China

Conference

BDSIC 2020


Cited By

  • (2024) Fine-grained gaze estimation based on the combination of regression and classification losses. Applied Intelligence. https://doi.org/10.1007/s10489-024-05778-3. Online publication date: 3-Sep-2024.
  • (2022) An Analytical Model for Estimating Average Driver Attention Based on the Visual Field. 2022 7th International Conference on Signal and Image Processing (ICSIP), 586-590. https://doi.org/10.1109/ICSIP55141.2022.9886244. Online publication date: 20-Jul-2022.
  • (2022) Young Driver Gaze (YDGaze): Dataset for driver gaze analysis. 2022 International Conference on Artificial Intelligence of Things (ICAIoT), 1-6. https://doi.org/10.1109/ICAIoT57170.2022.10121856. Online publication date: 29-Dec-2022.
  • (2021) Driver Gaze Zone Estimation via Head Pose Fusion Assisted Supervision and Eye Region Weighted Encoding. IEEE Transactions on Consumer Electronics, 67(4), 275-284. https://doi.org/10.1109/TCE.2021.3127006. Online publication date: 1-Nov-2021.
  • (2021) Driving distraction detection based on gaze activity. Electronics Letters, 57(22), 857-859. https://doi.org/10.1049/ell2.12286. Online publication date: 4-Aug-2021.
