Abstract
Learning processes are increasingly mediated by computers and the internet, and e-Learning, especially Web-based learning, is spreading into many fields. A typical e-Learning system evaluates a learner's state only from his/her answers to pre-programmed questions. However, the learner's time-sequentially changing emotions are important for managing the learning environment, just as they are in face-to-face classes. This paper describes an emotion diagnosis system that judges the emotions of an e-Learning user from his/her eye movement speed and fixation duration. The criteria for classifying four pairs of semantically opposite emotions (eight emotions in total) were established through a time-sequential subjective evaluation of the subjects' emotions and a time-sequential analysis of their eye movements. The coincidence ratios between the emotions discriminated by these criteria and the time-sequential subjective evaluations were 77% for "tired" and 60% for "concentrating".
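To make the diagnosis step concrete, the following Python sketch illustrates one possible threshold-based discrimination of a learner's state from eye movement speed and fixation duration. It is not the authors' implementation: the feature units, the threshold values, and the mapping to the "tired" and "concentrating" labels are assumptions for illustration only, whereas the paper's actual criteria were derived from the subjects' time-sequential subjective evaluations.

# Minimal sketch (illustrative assumptions, not the paper's actual criteria)
from dataclasses import dataclass

@dataclass
class GazeSample:
    """Eye-tracking features aggregated over one time window."""
    movement_speed: float      # average eye movement speed (deg/s), assumed unit
    fixation_duration: float   # average fixation duration (s), assumed unit

# Hypothetical decision thresholds; the study's real criteria come from
# time-sequential subjective evaluations of the subjects' emotions.
SPEED_THRESHOLD = 50.0        # deg/s
FIXATION_THRESHOLD = 0.4      # s

def diagnose(sample: GazeSample) -> str:
    """Return a coarse emotion label for one time window."""
    slow_eyes = sample.movement_speed < SPEED_THRESHOLD
    long_fixation = sample.fixation_duration > FIXATION_THRESHOLD
    if slow_eyes and long_fixation:
        return "concentrating"
    if slow_eyes and not long_fixation:
        return "tired"
    return "neutral"

if __name__ == "__main__":
    window = GazeSample(movement_speed=32.0, fixation_duration=0.55)
    print(diagnose(window))   # -> "concentrating" under these assumed thresholds

In practice such features would be computed per time window from the eye tracker's gaze stream, so that the diagnosed label changes over time in step with the learner's eye movements.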
Copyright information
© 2010 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Nosu, K., Koike, T., Shigeta, A. (2010). Time-Sequential Emotion Diagnosis Based on the Eye Movement of the User in Support of Web-Based e-Learning. In: Prasad, S.K., Vin, H.M., Sahni, S., Jaiswal, M.P., Thipakorn, B. (eds) Information Systems, Technology and Management. ICISTM 2010. Communications in Computer and Information Science, vol 54. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-12035-0_26
DOI: https://doi.org/10.1007/978-3-642-12035-0_26
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-12034-3
Online ISBN: 978-3-642-12035-0
eBook Packages: Computer Science (R0)