Abstract
Monitoring students' attention and engagement levels in the classroom is critical to promoting an interactive teaching process. In addition to teaching, instructors therefore make a constant effort to monitor and maintain students' attention and engagement throughout the lecture session. Recent advances in computer vision have produced tools that enable real-time head-pose estimation for multiple people with reasonable accuracy. This study aims to harness such technology to help instructors determine and evaluate students' attention levels faster and more accurately, thus supporting them in managing the session appropriately and to the greater benefit of the students. Our system employs an ordinary webcam, a standard computer, and simple computer vision algorithms. It detects the students' faces, estimates their levels of attention, and displays the results to the instructor through simple color-coded charts so that the necessary action can be taken during the lecture. To keep the system simple, the reported metric is an attention percentage together with an associated color-coded level of concern on which the instructor can act. Our results show that the proposed system can numerically evaluate students' attention in the classroom, individually and as a group, in real time. The scheme was tested in a classroom with 8 students, and the instructors' feedback was positive. For instance, one student showed 89% instantaneous attention, while overall classroom attention was 40% eleven minutes into the lecture. Such measurements can be obtained for each student and for the class as a whole at any time during the lecture, which is novel compared with similar studies that target only the individual student or only the overall class, mostly in offline mode or with complicated and impractical setups (more detailed comparisons are given in the introduction). We believe this work has the potential to significantly enhance instructors' teaching and students' academic performance.
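To make the webcam-to-color-code pipeline concrete, the sketch below shows one plausible way to approximate per-frame classroom attention using only OpenCV: frontal face detections are treated as "attentive" and the resulting percentage is mapped to a color-coded level of concern. This is an illustrative assumption on our part; the cascade-based proxy, the thresholds, and the attention_color helper are not the head-pose estimation method described in the paper.

```python
# Minimal sketch (assumption: frontal-face detection as a crude attention proxy,
# standing in for the head-pose estimation pipeline described in the paper).
import cv2

# Haar cascades shipped with OpenCV: a frontal detection approximates "facing the board".
frontal = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
profile = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_profileface.xml")

def attention_color(pct):
    """Map an attention percentage to a color-coded level of concern (illustrative thresholds)."""
    if pct >= 70:
        return "green"
    if pct >= 40:
        return "yellow"
    return "red"

cap = cv2.VideoCapture(0)  # ordinary webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    front_faces = frontal.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    side_faces = profile.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    total = len(front_faces) + len(side_faces)  # ignoring overlaps between detectors for brevity
    if total:
        # Fraction of detected students whose face is frontal = crude instantaneous class attention.
        pct = 100.0 * len(front_faces) / total
        print(f"class attention ~{pct:.0f}% -> {attention_color(pct)}")
    cv2.imshow("attention monitor", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

In the actual system, per-student head-pose angles (e.g., via facial landmarks and solvePnP) would replace the binary frontal/profile test, but the aggregation into a percentage and a color code follows the same pattern.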
Ethics declarations
Conflicts of interest/Competing interests
The authors have no conflicts of interest to declare that are relevant to the content of this article. The authors also have no financial or proprietary interests in any material discussed in this article.