Authors:
Ali Ismail-Fawaz; Maxime Devanne; Jonathan Weber and Germain Forestier
Affiliation:
IRIMAS, Université Haute-Alsace, Mulhouse, France
Keyword(s):
Self-Supervised Learning, Time Series Classification, Semi-Supervised Learning, Triplet Loss.
Abstract:
Self-Supervised Learning (SSL) encompasses a range of Machine Learning techniques that aim to reduce the amount of labeled data required to train a model. In Deep Learning models, SSL is often implemented through specific loss functions relying on pretext tasks that leverage unlabeled data. In this paper, we explore SSL for the specific task of Time Series Classification (TSC). In recent years, dozens of Deep Learning architectures have been proposed for TSC. However, they almost exclusively rely on the traditional training procedure involving only labeled data in sufficient quantities. To study the potential of SSL for TSC, we propose the TRIplet Loss In TimE (TRILITE), which relies on an existing triplet loss mechanism and does not require labeled data. We explore two use cases. In the first, we evaluate the ability of TRILITE to boost a supervised classifier when very few labeled data are available. In the second, we study the use of TRILITE in the context of semi-supervised learning, when both labeled and unlabeled data are available. Experiments performed on 85 datasets from the UCR archive reveal interesting results on some datasets in both use cases.
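To illustrate the mechanism the abstract refers to, the following is a minimal sketch of a triplet loss over time-series embeddings, trained without labels. The toy 1D-CNN encoder, the margin value, and the way positives (noisy views of each anchor) and negatives (other series in the batch) are formed are illustrative assumptions here, not TRILITE's exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """Toy 1D-CNN encoder mapping a univariate series to an embedding."""
    def __init__(self, emb_dim=32):
        super().__init__()
        self.conv = nn.Conv1d(1, 16, kernel_size=7, padding=3)
        self.head = nn.Linear(16, emb_dim)

    def forward(self, x):                        # x: (batch, 1, length)
        h = F.relu(self.conv(x)).mean(dim=2)     # global average pooling
        return self.head(h)                      # (batch, emb_dim)

def triplet_loss(anchor, positive, negative, margin=1.0):
    d_pos = F.pairwise_distance(anchor, positive)  # distance to positive view
    d_neg = F.pairwise_distance(anchor, negative)  # distance to negative sample
    # Pull the positive closer than the negative by at least `margin`
    return F.relu(d_pos - d_neg + margin).mean()

# Unlabeled batch: positives are noisy copies of the anchors (a stand-in
# for a real augmentation); negatives are other series from the same batch.
x = torch.randn(8, 1, 128)
enc = Encoder()
loss = triplet_loss(enc(x),
                    enc(x + 0.05 * torch.randn_like(x)),
                    enc(x[torch.randperm(8)]))
loss.backward()
```

In this setup the loss can be minimized on unlabeled series alone; the resulting encoder can then back a classifier trained on the few labeled examples, matching the two use cases described above.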