Abstract
Smartphone-based activity recognition has recently received remarkable attention in various mobile-health applications such as safety monitoring, fitness tracking, and disease prediction. To achieve more accurate and simplified medical monitoring, this paper proposes a self-learning scheme for patients’ activity recognition, in which a patient only needs to carry an ordinary smartphone containing common motion sensors. After real-time data collection through this smartphone, we preprocess the data using a coordinate-system transformation to eliminate the influence of phone orientation. A set of robust and effective features is then extracted from the preprocessed data. Because a patient may inevitably perform various unpredictable activities for which no a priori knowledge exists in the training dataset, we propose a self-learning activity recognition scheme. The scheme determines whether the training pool contains a priori training samples and labeled categories that match the unpredictable activity data well. If not, it automatically assembles these unpredictable samples into different clusters and assigns them new category labels. The clustered samples, together with the newly acquired category labels, are then merged into the training dataset to reinforce the recognition ability of the self-learning model. In experiments, we evaluate our scheme using data collected from two postoperative patient volunteers, with six labeled daily activities serving as the initial a priori categories in the training pool. Experimental results demonstrate that the proposed self-learning scheme for activity recognition works very well in most cases. When several types of unseen activities without any a priori information are present, the accuracy reaches above 80% after the self-learning process converges.
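The self-learning loop described above (check whether new samples match a priori categories; if not, cluster them, assign new labels, and retrain) can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a one-class SVM as the novelty detector, DBSCAN as the clusterer, and a random forest as the activity classifier, with hypothetical thresholds and feature vectors.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import OneClassSVM
from sklearn.cluster import DBSCAN

def self_learning_step(X_train, y_train, X_new, eps=0.5, min_samples=5):
    """One round of the self-learning scheme: a one-class SVM trained on
    the current pool flags samples with no matching a priori category;
    those samples are clustered by DBSCAN, given fresh category labels,
    merged into the pool, and the classifier is retrained."""
    novelty = OneClassSVM(nu=0.1, gamma="scale").fit(X_train)
    known = novelty.predict(X_new) == 1           # +1 = matches the training pool
    X_unseen = X_new[~known]
    if len(X_unseen) > 0:
        cluster_ids = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(X_unseen)
        core = cluster_ids >= 0                   # -1 marks DBSCAN noise; drop it
        new_labels = cluster_ids[core] + y_train.max() + 1
        X_train = np.vstack([X_train, X_unseen[core]])
        y_train = np.concatenate([y_train, new_labels])
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
    return clf, X_train, y_train
```

In use, each batch of freshly collected feature vectors is passed through `self_learning_step`; over successive rounds the training pool absorbs new activity categories, which is the convergence behavior the experiments measure.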
Acknowledgments
This research was sponsored by the National Natural Science Foundation of China (Nos. 61401029, 61171014, 61472044, 61472403, and 61371185), the Beijing Advanced Innovation Center for Future Education (BJAICFE2016IR-004), the Fundamental Research Funds for the Central Universities (No. 2012LYB46), and the Beijing Youth Excellence Program (YETP0296).
This article is part of the Topical Collection on Patient Facing Systems
Appendix
Cite this article
Guo, J., Zhou, X., Sun, Y. et al. Smartphone-Based Patients’ Activity Recognition by Using a Self-Learning Scheme for Medical Monitoring. J Med Syst 40, 140 (2016). https://doi.org/10.1007/s10916-016-0497-2