research-article
Open access

Robust Sensor-Orientation-Independent Feature Selection for Animal Activity Recognition on Collar Tags

Published: 26 March 2018

Abstract

Real-time animal activity recognition faces fundamental challenges: motion data vary with changing sensor orientation, the number of candidate features is large, and animal tags impose tight energy and processing constraints. This paper aims to find small, optimal feature sets that are lightweight and robust to sensor orientation. Our approach comprises four main steps. First, 3D feature vectors are selected because they are theoretically independent of orientation. Second, the least informative features are suppressed to speed up computation and increase robustness against overfitting. Third, the remaining features are narrowed down with an embedded method that performs feature selection and classification simultaneously. Finally, the feature sets are optimized through 10-fold cross-validation. We collected real-world data from multiple sensors placed around the necks of five goats. The results show that activities can be recognized accurately using only accelerometer data and a few lightweight features, and that performance is robust to sensor orientation and position. A simple Naive Bayes classifier using only a single feature achieved 94% accuracy on our empirical dataset, and our optimal feature set yielded an average accuracy of 94% when applied with six other classifiers. This work supports embedded, real-time, energy-efficient, and robust activity recognition for animals.
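To make the pipeline above concrete, the following is a minimal sketch of the core idea, assuming a rotation-invariant feature derived from the magnitude of the 3D acceleration vector and a Gaussian Naive Bayes classifier scored with 10-fold cross-validation. The specific feature (standard deviation of the per-sample magnitude), the window length, and the synthetic data are illustrative assumptions for this sketch, not the authors' implementation or their selected optimal feature set.

```python
# Minimal sketch (not the authors' code): orientation-independent feature
# extraction and Naive Bayes classification with 10-fold cross-validation.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score


def magnitude_feature(window):
    """window: (n_samples, 3) accelerometer readings (ax, ay, az).

    The Euclidean norm of the 3D acceleration vector is invariant to sensor
    rotation, so features derived from it do not depend on how the collar
    tag happens to be oriented on the animal's neck.
    """
    mag = np.linalg.norm(window, axis=1)   # per-sample magnitude
    return np.array([mag.std()])           # one lightweight feature (assumed)


def extract_features(signal, window_len=64):
    """Split a (n, 3) accelerometer stream into fixed-length windows."""
    n_windows = len(signal) // window_len
    return np.vstack([
        magnitude_feature(signal[i * window_len:(i + 1) * window_len])
        for i in range(n_windows)
    ])


# Hypothetical usage with synthetic data standing in for collar recordings:
rng = np.random.default_rng(0)
still = rng.normal(0.0, 0.05, (6400, 3)) + [0, 0, 1]   # low-variance "stationary"
active = rng.normal(0.0, 0.6, (6400, 3)) + [0, 0, 1]   # high-variance "active"
X = np.vstack([extract_features(still), extract_features(active)])
y = np.array([0] * 100 + [1] * 100)

scores = cross_val_score(GaussianNB(), X, y, cv=10)     # 10-fold cross-validation
print(f"mean accuracy: {scores.mean():.2f}")
```

The point of the sketch is only that features computed from the acceleration magnitude are unaffected by sensor rotation, which is the property the paper exploits for orientation-independent recognition on resource-constrained collar tags.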




Information & Contributors


Published In

Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies  Volume 2, Issue 1
March 2018
1370 pages
EISSN:2474-9567
DOI:10.1145/3200905
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 26 March 2018
Accepted: 01 January 2018
Revised: 01 November 2017
Received: 01 May 2017
Published in IMWUT Volume 2, Issue 1


Author Tags

  1. Animal Activity Recognition
  2. Decision Tree
  3. Embedded Systems
  4. Machine Learning
  5. Naive Bayes
  6. Sensor Orientation

Qualifiers

  • Research-article
  • Research
  • Refereed

Funding Sources

  • Netherlands Organisation for Scientific Research (NWO).
  • University of Twente, Wageningen University & Research, ASTRON Dwingeloo, and Leiden University.


Article Metrics

  • Downloads (Last 12 months): 175
  • Downloads (Last 6 weeks): 16
Reflects downloads up to 22 Sep 2024


Cited By

  • (2024) Goats on the Move: Evaluating Machine Learning Models for Goat Activity Analysis Using Accelerometer Data. Animals 14:13 (1977). DOI: 10.3390/ani14131977. Online publication date: 4-Jul-2024
  • (2024) TS2ACT. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 7:4 (1-22). DOI: 10.1145/3631445. Online publication date: 12-Jan-2024
  • (2024) ADA-SHARK. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 7:4 (1-25). DOI: 10.1145/3631416. Online publication date: 12-Jan-2024
  • (2024) Livestock feeding behaviour: A review on automated systems for ruminant monitoring. Biosystems Engineering 246 (150-177). DOI: 10.1016/j.biosystemseng.2024.08.003. Online publication date: Oct-2024
  • (2023) Classification of behaviors of free-ranging cattle using accelerometry signatures collected by virtual fence collars. Frontiers in Animal Science 4. DOI: 10.3389/fanim.2023.1083272. Online publication date: 14-Apr-2023
  • (2023) Investigating the Effect of Orientation Variability in Deep Learning-based Human Activity Recognition. Adjunct Proceedings of the 2023 ACM International Joint Conference on Pervasive and Ubiquitous Computing & the 2023 ACM International Symposium on Wearable Computing (480-485). DOI: 10.1145/3594739.3610742. Online publication date: 8-Oct-2023
  • (2023) A non-Restraining Sheep Activity Detection and Surveillance using Deep Machine Learning. 2023 16th International Conference on Developments in eSystems Engineering (DeSE) (66-72). DOI: 10.1109/DeSE60595.2023.10469582. Online publication date: 18-Dec-2023
  • (2023) A full end-to-end deep approach for detecting and classifying jaw movements from acoustic signals in grazing cattle. Engineering Applications of Artificial Intelligence 121:C. DOI: 10.1016/j.engappai.2023.106016. Online publication date: 1-May-2023
  • (2023) A teacher-to-student information recovery method toward energy-efficient animal activity recognition at low sampling rates. Computers and Electronics in Agriculture 213:C. DOI: 10.1016/j.compag.2023.108242. Online publication date: 1-Oct-2023
  • (2023) Deep learning-based animal activity recognition with wearable sensors. Computers and Electronics in Agriculture 211:C. DOI: 10.1016/j.compag.2023.108043. Online publication date: 1-Aug-2023
