
Remote photoplethysmography (rPPG) based learning fatigue detection

Published in Applied Intelligence

Abstract

Remote photoplethysmography (rPPG), which uses facial video to measure subtle variations in skin reflection, is a contactless method for monitoring human cardiovascular activity. Owing to its simplicity, convenience, and potential for large-scale application, rPPG has attracted growing attention over the past decade. However, its accuracy, reliability, and computational complexity have not yet reached the expected standards, so rPPG has found very limited application in the educational field. To alleviate this issue, this study proposes an rPPG-based learning fatigue detection system consisting of three modules. First, we propose an rPPG extraction module, which realizes real-time, pervasive biomedical signal monitoring. Second, we propose an rPPG reconstruction module, which estimates heart rate using a hybrid 1D/2D deep convolutional neural network. Third, we propose a learning fatigue classification module based on multi-source feature fusion, which classifies a learner's state as either non-fatigue or fatigue. To verify its performance, the proposed system is tested on a self-collected dataset. Experimental results demonstrate that (i) the accuracy of heart rate estimation is better than that of cutting-edge methods, and (ii) under both subject-dependent and subject-independent cross-validation, the proposed system not only learns person-independent features for fatigue detection but also detects early fatigue with very high accuracy.
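To make the three-module pipeline concrete, the sketch below strings together a simple stand-in for each stage: per-frame skin-colour averaging with band-pass filtering in place of the rPPG extraction module, a spectral-peak heart-rate estimate in place of the hybrid 1D/2D CNN reconstruction module, and a hand-tuned two-feature rule in place of the multi-source fusion classifier. It is a minimal illustration only; the frame rate, pulse band, feature choices, and thresholds are assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 30.0  # assumed camera frame rate (frames per second)

def extract_rppg(mean_green_per_frame):
    """Module 1 stand-in: band-pass the per-frame mean green value of the
    facial skin region to a plausible pulse band (0.7-4.0 Hz, 42-240 bpm)."""
    trace = np.asarray(mean_green_per_frame, dtype=float)
    b, a = butter(3, [0.7 / (FS / 2), 4.0 / (FS / 2)], btype="band")
    return filtfilt(b, a, trace - trace.mean())

def estimate_hr_bpm(rppg, fs=FS):
    """Module 2 stand-in: take the dominant spectral peak as the heart rate
    (the paper instead reconstructs the signal with a hybrid 1D/2D CNN)."""
    spectrum = np.abs(np.fft.rfft(rppg))
    freqs = np.fft.rfftfreq(len(rppg), d=1.0 / fs)
    band = (freqs >= 0.7) & (freqs <= 4.0)
    return 60.0 * freqs[band][np.argmax(spectrum[band])]

def classify_fatigue(hr_bpm, hr_baseline_bpm, blink_rate_hz):
    """Module 3 stand-in: a hand-tuned rule fusing two illustrative features;
    both the features and the thresholds are hypothetical."""
    hr_drop = hr_bpm < 0.92 * hr_baseline_bpm   # sustained HR drop vs. personal baseline
    drowsy_blinking = blink_rate_hz > 0.5       # unusually frequent blinking
    return "fatigue" if (hr_drop or drowsy_blinking) else "non-fatigue"

if __name__ == "__main__":
    # Synthetic 30 s "facial video" trace: a 72 bpm pulse plus measurement noise.
    rng = np.random.default_rng(0)
    t = np.arange(0, 30, 1.0 / FS)
    trace = 0.5 * np.sin(2 * np.pi * 1.2 * t) + 0.1 * rng.standard_normal(t.size)
    hr = estimate_hr_bpm(extract_rppg(trace))
    print(f"estimated HR: {hr:.1f} bpm ->",
          classify_fatigue(hr, hr_baseline_bpm=75.0, blink_rate_hz=0.2))
```

In the actual system each stand-in is replaced by a learned component; the sketch only fixes the data flow between the three modules.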


Data Availability

The datasets generated and analysed during the current study are available from the corresponding author on reasonable request.

Notes

  1. https://www.nhtsa.gov/risky-driving/drowsy-driving

  2. http://www.sce.carleton.ca/faculty/chan/matlab/

  3. https://github.com/qiriro/PPG

  4. https://pypi.org/project/biosppy/

  5. https://pypi.org/project/heartpy/

  6. https://pypi.org/project/pyhrv/

  7. https://pypi.org/project/hrv-analysis/

  8. http://sleepdisordersflorida.com/pvt1.html#responseOut

  9. https://www.nasa.gov/feature/ames/fighting-fatigue-app/
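Several of the notes above (biosppy, heartpy, pyhrv, and hrv-analysis in notes 4-7) point to off-the-shelf PPG/HRV toolkits. The sketch below shows, with plain NumPy/SciPy and a synthetic pulse wave, the kind of heart-rate and HRV measures (SDNN, RMSSD) those packages return out of the box; the sampling rate, peak-detection settings, and synthetic signal are assumptions for illustration only.

```python
import numpy as np
from scipy.signal import find_peaks

FS = 100.0  # assumed PPG sampling rate (Hz)

def hr_and_hrv(ppg, fs=FS):
    """Beat-to-beat intervals -> mean heart rate plus two common HRV features
    (SDNN, RMSSD). Toolkits such as heartpy and pyhrv (notes 4-7) compute these
    and many more; this is just the bare arithmetic."""
    peaks, _ = find_peaks(ppg, height=0.5, distance=int(0.4 * fs))  # peaks >= 0.4 s apart
    ibi_ms = np.diff(peaks) / fs * 1000.0                           # inter-beat intervals (ms)
    return {
        "bpm": 60000.0 / float(np.mean(ibi_ms)),
        "sdnn_ms": float(np.std(ibi_ms, ddof=1)),
        "rmssd_ms": float(np.sqrt(np.mean(np.diff(ibi_ms) ** 2))),
    }

if __name__ == "__main__":
    # Synthetic 60 s pulse wave at ~1.1 Hz (66 bpm) with mild additive noise.
    rng = np.random.default_rng(1)
    t = np.arange(0, 60, 1.0 / FS)
    ppg = np.sin(2 * np.pi * 1.1 * t) + 0.02 * rng.standard_normal(t.size)
    print(hr_and_hrv(ppg))
```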

References

  1. Ambrosanio M, Franceschini S, Grassini G, Baselice F (2019) A multi-channel ultrasound system for non-contact heart rate monitoring. IEEE Sens J 20(4):2064–2074

  2. Verkruysse W, Svaasand LO, Nelson JS (2008) Remote plethysmographic imaging using ambient light. Opt Express 16(26):21435–21445

  3. Qayyum A, Mazher M, Nuhu A, Benzinou A, Malik AS, Razzak I (2022) Assessment of physiological states from contactless face video: a sparse representation approach. Computing 1-21

  4. Fouad RM, Omer OA, Aly MH (2019) Optimizing remote photoplethysmography using adaptive skin segmentation for real-time heart rate monitoring. IEEE Access 7:76513–76528

  5. Poh M, Mcduff DJ, Picard RW (2010) Non-contact, automated cardiac pulse measurements using video imaging and blind source separation. Opt Express 18(10):10762–10774

  6. Wu C, Yuan Z, Wan S, Wang L (2022) Anti-jamming heart rate estimation using a spatial-temporal fusion network. Comput Vis Image Underst 216:103327

  7. Wang W, Brinker BD, Stuijk S, de Haan G (2017) Algorithmic principles of remote PPG. IEEE Trans Biomed Eng 64(7):1479–1491

  8. Bousefsaf F, Maaoui C, Pruski A (2013) Continuous wavelet filtering on webcam photoplethysmographic signals to remotely assess the instantaneous heart rate. Biomed Signal Process Control 8(6):568–574

  9. Bousefsaf F, Maaoui C, Pruski A (2017) Automatic selection of webcam photoplethysmographic pixels based on lightness criteria. J Med Biol Eng 37(3):374–385

  10. Zhan Q, Wang W, Haan GD (2020) Analysis of CNN-based remote-PPG to understand limitations and sensitivities. Biomed Opt Express 11(3):1268–1283

  11. Wang D, Yang X, Liu X, Jing J, Fang S (2020) Detail-preserving pulse wave extraction from facial videos using consumer-level camera. Biomed Opt Express 11(4):1876–1891

  12. Benezeth Y, Li P, Macwan R, Nakamura K, Gomez R, Yang F (2018) Remote heart rate variability for emotional state monitoring, in IEEE EMBS, 153-156

  13. Sabour RM, Benezeth Y, Oliveira PD, Chappe J, Yang F (2021) UBFC-Phys: a multimodal database for psychophysiological studies of social stress. IEEE Trans Affect Comput 14(1):622–636

  14. Nikolaiev S, Telenyk S, Tymoshenko Y (2020) Non-contact video-based remote photoplethysmography for human stress detection. J Autom Mobile Robot Intell Syst 14(2):63–73

  15. Luchi K, Mitsuhashi R, Goto T, Matsubara A, Hirayama T, Hashizume H, Tsumura N (2020) Stress levels estimation from facial video based on non-contact measurement of pulse wave. Artif Life Robot 25:335–342

  16. Maior CBS, Moura MJC, Santana JMM, Lins ID (2020) Real-time classification for autonomous drowsiness detection using eye aspect ratio. Expert Syst Appl 158:113505

  17. Ramzan M, Khan HU, Awan SM, Ismail A, Ilyas M, Mahmood A (2019) A survey on state-of-the-art drowsiness detection techniques. IEEE Access 7:61904–61919

  18. Zhao G, Liu S, Wang Q, Hu T (2018) Deep convolutional neural network for drowsy student state detection. Concurr Comp-pract E 30:e4457

  19. Hu J, Zhang H (2021) Recognition of classroom student state features based on deep learning algorithms and machine learning. J Intell Fuzzy Syst 40(2):2361–2372

  20. Zhou M, Zhang X (2019) Online social networking and subjective well-being: mediating effects of envy and fatigue. Comput Educ 140:103598

  21. Zhao L, Li M, He Z, Ye S, Qin H, Zhu X, Dai Z (2022) Data-driven learning fatigue detection system: a multimodal fusion approach of ECG (electrocardiogram) and video signals. Measurement 201:111648

  22. Zhang Y, Dong Z, Zhang K, Shu S, Lu F, Chen J (2021) Illumination variation-resistant video-based heart rate monitoring using LAB color space. Opt Lasers Eng 136

  23. Balakrishnan G, Durand F, Guttag J (2013) Detecting pulse from head motions in video, in CVPR, 3430-3437

  24. Liu X, Yang X, Wang D, Wong A (2021) Detecting pulse rates from facial videos recorded in unstable lighting conditions: an adaptive spatiotemporal homomorphic filtering algorithm. IEEE Trans Instrum Meas 70:1–15

  25. Yu Y, Raveendran P, Lim CL, Kwan BH (2015) Dynamic heart rate estimation using principal component analysis. Biomed Opt Express 6(11):4610–4618

  26. Holton BD, Mannapperuma K, Lesniewski PJ, Thomas JC (2013) Signal recovery in imaging photoplethysmography. Physiol Meas 34(11):1499–1511

  27. Haan GD, Jeanne V (2013) Robust pulse rate from chrominance-based rPPG. IEEE Trans Biomed Eng 60(10):2878–2886

  28. Wang C, Pun T, Chanel G (2018) A comparative survey of methods for remote heart rate detection from frontal face videos. Front Bioeng Biotechnol 6:33

  29. Macwan R, Benezeth Y, Mansouri A (2018) Remote photoplethysmography with constrained ICA using periodicity and chrominance constraints. Biomed Eng Online 17(1):1–22

  30. Yang Z, Yang X, Jin J, Wu X (2019) Motion-resistant heart rate measurement from face videos using patch-based fusion. Signal Image Video Process 13(3):423–430

  31. Hassan MA, Malik AS, Fofi D, Saad N, Meriaudeau F (2017) Novel health monitoring method using an RGB camera. Biomed Opt Express 8(11):4838–4854

  32. Bal U (2015) Non-contact estimation of heart rate and oxygen saturation using ambient light. Biomed Opt Express 6(1):86–97

  33. Li X, Chen J, Zhao G, Pietikainen M (2014) Remote heart rate measurement from face videos under realistic situations, in CVPR, 4264-4271

  34. Hassan H, Jaidka S, Dwyer VM, Hu S (2018) Assessing blood vessel perfusion and vital signs through retinal imaging photoplethysmography. Biomed Opt Express 9(5):315088

  35. Welch P (1967) The use of fast Fourier transform for the estimation of power spectra: a method based on time averaging over short, modified periodograms. IEEE Trans Audio Electroacoust 15(2):70–73

  36. Chen W, McDuff D (2018) DeepPhys: video-based physiological measurement using convolutional attention networks, in ECCV 349-365

  37. Hsu GJ, Xie R, Ambikapathi A, Chou K (2020) A deep learning framework for heart rate estimation from facial videos. Neurocomputing 417:155–166

  38. Reiss A, Indlekofer I, Schmidt P, Laerhoven KV (2019) Deep PPG: large-scale heart rate estimation with convolutional neural networks. Sensors 19(14):3079

  39. Biswas D, Everson L, Liu M, Panwar M, Verhoef B, Patki S, Kim CH, Acharyya A, Hoof CV, Konijnenburg M, Helleputte NV (2019) CorNET: deep learning framework for PPG based heart rate estimation and biometric identification in ambulant environment. IEEE Trans Biomed Circuits Syst 13(2):282–291

  40. Ni A, Azarang A, Kehtarnavaz N (2021) A review of deep learning-based contactless heart rate measurement methods. Sensors 21(11):3719

  41. Fortenbacher A, Pinkwart N, Yun HS (2017) [LISA] learning analytics for sensor-based adaptive learning, in Proc. LAK’17 592-593

  42. Ribeiro D, Teixeira C, Cardoso A (2018) Web-based platform for training in biomedical signal processing and classification: the particular case of EEG-based drowsiness detection. Int J Online Biomed Eng 14(03):164–171

  43. Nakamura S, Darasawang P, Reinders H (2021) The antecedents of boredom in L2 classroom learning. System 98:102469

  44. Fujiwara K, Abe E, Kamata K, Nakayama C, Suzuki Y, Yamakawa T, Hiraoka T, Kano M, Sumi Y, Masuda F, Matsuo M, Kadotani H (2019) Heart rate variability-based driver drowsiness detection and its validation with EEG. IEEE Trans Biomed Eng 66(6):1769–1778

  45. Fu R, Wang H (2014) Detection of driving fatigue by using noncontact EMG and ECG signals measurement system. Int J Neural Syst 24(03):1450006

  46. Sikander G, Anwar S (2019) Driver fatigue detection systems: a review. IEEE Trans Intell Transp Syst 20(6):2339–2352

  47. Choi M, Koo G, Seo M, Kim SW (2018) Wearable device-based system to monitor a driver’s stress, fatigue, and drowsiness. IEEE Trans Instrum Meas 67(3):634–645

  48. Monteiro TG, Skourup C, Zhang H (2020) Optimizing CNN hyperparameters for mental fatigue assessment in demanding maritime operations. IEEE Access 8:40402–40412

  49. Pan T, Wang H, Si H, Li Y, Shang L (2021) Identification of pilots’ fatigue status based on electrocardiogram signals. Sensors 21(9):3003

  50. Franceschini S, Ambrosanio M, Baselice F (2020) MUHD: a multi-channel ultrasound prototype for remote heartbeat detection, in BIODEVICES, 57-63

  51. Doudou M, Bouabdallah A, Berge-Cherfaoui V (2020) Driver drowsiness measurement technologies: current research, market solutions, and challenges. Int J ITS Res 18:297–319

  52. Wang Y, Wang W, Zhou M, Ren A, Tian Z (2020) Remote monitoring of human vital signs based on 77-GHz mm-Wave FMCW radar. Sensors 20(10):2999

  53. Du G, Li T, Li C, Liu PX, Li D (2021) Vision-based fatigue driving recognition method integrating heart rate and facial features. IEEE Trans Intell Transp Syst 22(5):3089–3100

  54. Al-Naji A, Gibson K, Lee SH, Chahl J (2017) Monitoring of cardiorespiratory signal: principles of remote measurements and review of methods. IEEE Access 5:15776–15790

  55. Deng W, Wu R (2019) Real-time driver-drowsiness detection system using facial features. IEEE Access 7:118727–118738

  56. Sahayadhas A, Sundaraj K, Murugappan M (2021) Detecting driver drowsiness based on sensors: A review. Sensors 12(12):16937–16953

  57. You F, Gong Y, Tu H, Liang J, Wang H (2020) A fatigue driving detection algorithm based on facial motion information entropy, J Adv Transport 8851485

  58. Zhang L, Fu C, Hong H, Xue B, Gu X, Zhu X, Li C (2021) Non-contact dual-modality emotion recognition system by CW radar and RGB camera. IEEE Sens J 21(20):23198–23212

  59. Zhao M, Adib F, Katabi D (2018) Emotion recognition using wireless signals. Commun ACM 61(9):91–100

  60. Siddiqui HUR, Saleem AA, Brown R, Bademci B, Lee E, Rustam F, Dudley S (2021) Non-invasive driver drowsiness detection system. Sensors 21(14):4833

  61. Zhang Y, Tsujikawa M, Onishi Y (2019) Sleep/wake classification via remote PPG signals. in EMBC, 3226-3230

  62. Dong Z, Zhang M, Sun J, Cao T, Liu R, Wang Q, Liu D (2021) A fatigue driving detection method based on frequency modulated continuous wave radar. in ICCECE, 670-675

  63. Zhang J, Wu Y, Chen Y, Wang J, Huang J, Zhang Q (2022) Ubi-Fatigue: towards ubiquitous fatigue detection via contactless sensing. IEEE Internet Things J 9(15):1–13

  64. Cimr D, Busovsky D, Fujita H, Studnicka F, Cimler R, Hayashi T (2023) Classification of health deterioration by geometric invariants. Comput Methods Programs Biomed 239

  65. Sadek I, Biswas J, Abdulrazak B (2019) Ballistocardiogram signal processing: A review. Health Inf Sci Syst 7-10

  66. Kim C, Ober SL, McMurtry M, Finegan BA, Inan OT, Mukkamala R, Hahn J (2016) Ballistocardiogram: Mechanism and potential for unobtrusive cardiovascular health monitoring. Sci Rep 6:31297

  67. Liu F, Li X, Lv T, Xu F (2019) A review of driver fatigue detection: progress and prospect. in ICCE, 1-6

  68. Sahayadhas A, Sundaraj K, Murugappan M (2012) Detecting driver drowsiness based on sensors: a review. Sensors 12(12):16937–16953

  69. Shu L, Xie J, Yang M, Li Z, Li Z, Liao D, Xu X, Yang X (2018) A review of emotion recognition using physiological signals. Sensors 18(7):2074

  70. Kessler V, Thiam P (2017) Multimodal fusion including camera photoplethysmography for pain recognition. in ICCV workshop, 1-4

  71. Tomasi C, Kanade T (1991) Detection and tracking of point features, in Technical Report CMU-CS-91-132, Carnegie Mellon University

  72. Liu S, Zhao L, Yang X, Du Y, Li M, Zhu X (2022) Remote fatigue detection based on the mmWave FMCW radar. IEEE Sens J 22(15):15222–15234

  73. Subasi A, Kiymik MK (2010) Muscle fatigue detection in EMG using time-frequency methods, ICA and neural networks. J Med Syst 34:777–785

  74. Niroshana SMI, Zhu X, Nakamura K, Chen W (2021) A fused-image-based approach to detect obstructive sleep apnea using a single-lead ECG and a 2D convolutional neural network. PLoS One 16(4)

  75. Khare SK, Bajaj V, Acharya UR (2021) SPWVD-CNN for automated detection of schizophrenia patients using EEG Signals. IEEE Trans Instrum Meas 70:1–9

  76. Miranda-Correa JA, Abadi MK, Sebe N, Patras I (2021) AMIGOS: a dataset for affect, personality and mood research on individuals and groups. IEEE Trans Affect Comput 12(2):479–493

  77. Wang S, Li H, Chang E, Wu A (2018) Entropy-assisted emotion recognition of valence and arousal using XGBoost classifier. in AIAI, 249–260

  78. Rosenstein MT, Collins JJ, De Luca CJ (1993) A practical method for calculating largest Lyapunov exponents from small data sets. Physica D 65(1–2):117–134

  79. Thakur A Approaching (almost) any machine learning problem, https://github.com/abhishekkrthakur/approachingalmost

  80. Qin P, Wang M, Chen Z, Yan G, Yan T, Han C, Bao Y, Wang X (2021) Characteristics of driver fatigue and fatigue-relieving effect of special light belt in extra-long highway tunnel: a real-road driving study. Tunn Undergr Space Technol 114

  81. Sikander G, Anwar S (2020) A novel machine vision-based 3D facial action unit identification for fatigue detection. IEEE Trans Intell Transp Syst 22(5):2730–2740

  82. Basner M, Moore TM, Nasrini J, Gur RC, Dinges DF (2021) Response speed measurements on the psychomotor vigilance test: how precise is precise enough? Sleep 44(1):zsaa121

  83. Pagano TP, Santos LL, Santos VR, Miranda Sá PH, Bonfim YS, Paranhos JVD, Ortega LL, Nascimento LFS, Santos A, Rönnau MM, Winkler I, Nascimento EGS (2022) Remote heart rate prediction in virtual reality head-mounted displays using machine learning techniques. Sensors 22(23):9486

  84. Spetlik R, Franc V, Matas J (2018) Visual heart rate estimation with convolutional neural network. in British Machine Vision Conference 3-6

  85. Yu Z, Li X, Zhao G (2019) Remote photoplethysmograph signal measurement from facial videos using spatio-temporal networks. in BMVC 1–12

  86. Lee E, Chen E, Lee C (2020) Meta-rPPG: remote heart rate estimation using a transductive meta-learner. in ECCV

Acknowledgements

This work was supported in part by the National Key R&D Program of China (under Grant 2020AAA0108804), the National Natural Science Foundation of China (under Grants 61937001, 62077021, 62207018, 62277026), the Ministry of Education Humanities and Social Science Project (under Grant 22YJC880117), the Natural Science Foundation of Hubei Province (under Grant 2021CFB157), and the Fundamental Research Funds for the Central Universities (under Grant CCNU22LJ005).

Author information

Corresponding authors

Correspondence to Xiaoliang Zhu or Zhicheng Dai.

Ethics declarations

Disclosures

The authors declare that there are no conflicts of interest related to this article.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix

MAHNOB-HCI, a widely used open-access dataset, contains 27 subjects in total [36, 84–86]. Each subject took part in two experiments: (i) emotion elicitation and (ii) implicit tagging. In total, 20 high-resolution videos were collected for each subject. Following [86], a 30-second interval (frames 306 to 2135) of 527 sequences is used in this study. The experimental results are shown in Table 9, where four metrics are again used for comparison. From this table, we can see that the proposed system outperforms the state-of-the-art methods in terms of SD and Corr. Taking SD as an example, the proposed system achieves an improvement of 1.11 over [86]. In addition, it should be noted that the training times of the proposed model and [86] on the same processor are 2.0 h and 10.9 h, respectively; that is, the computational complexity of the proposed system is much lower.

Table 9 Comparison of different DL methods on MAHNOB-HCI
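For readers reproducing the Table 9 comparison, the snippet below computes the two metrics named above, the standard deviation of the heart-rate estimation error (SD) and the Pearson correlation (Corr), together with MAE and RMSE, which are commonly reported alongside them; whether the latter two are the remaining metrics of Table 9 is an assumption, and the sample values are hypothetical.

```python
import numpy as np

def hr_evaluation_metrics(hr_true, hr_pred):
    """SD (standard deviation of the estimation error) and Corr (Pearson
    correlation), plus MAE and RMSE as commonly paired extras."""
    hr_true = np.asarray(hr_true, dtype=float)
    hr_pred = np.asarray(hr_pred, dtype=float)
    err = hr_pred - hr_true
    return {
        "SD": float(np.std(err, ddof=1)),
        "Corr": float(np.corrcoef(hr_true, hr_pred)[0, 1]),
        "MAE": float(np.mean(np.abs(err))),
        "RMSE": float(np.sqrt(np.mean(err ** 2))),
    }

if __name__ == "__main__":
    # Hypothetical ground-truth vs. estimated HR values (bpm), for illustration only.
    gt  = [72.0, 65.5, 80.2, 91.0, 68.3]
    est = [73.1, 66.0, 78.9, 92.4, 67.5]
    print(hr_evaluation_metrics(gt, est))
```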

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Zhao, L., Zhang, X., Niu, X. et al. Remote photoplethysmography (rPPG) based learning fatigue detection. Appl Intell 53, 27951–27965 (2023). https://doi.org/10.1007/s10489-023-04926-5
