Unsupervised scalable representation learning for multivariate time series

JY Franceschi, A Dieuleveut… - Advances in Neural Information Processing Systems, 2019 - proceedings.neurips.cc
Abstract
Time series constitute a challenging data type for machine learning algorithms, due to their highly variable lengths and sparse labeling in practice. In this paper, we tackle this challenge by proposing an unsupervised method to learn universal embeddings of time series. Unlike previous works, it is scalable with respect to their length, and we demonstrate the quality, transferability and practicability of the learned representations with thorough experiments and comparisons. To this end, we combine an encoder based on causal dilated convolutions with a novel triplet loss employing time-based negative sampling, obtaining general-purpose representations for variable-length and multivariate time series.
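To make the two ingredients named in the abstract concrete, below is a minimal PyTorch-style sketch of a causal dilated convolution encoder and a triplet loss driven by time-based negative sampling. It is an illustration under assumed hyperparameters (layer count, channel widths, embedding size); the class and function names are hypothetical and not taken from the authors' released code.

```python
# Illustrative sketch only: a causal dilated convolution encoder producing a
# fixed-size embedding for time series of any length, plus a triplet loss where
# the anchor and positive are subseries of the same series and negatives are
# subseries drawn from other series (time-based negative sampling).
import torch
import torch.nn as nn
import torch.nn.functional as F


class CausalDilatedEncoder(nn.Module):
    """Stack of dilated 1D convolutions with left padding (causal), followed by
    global max pooling over time to handle variable-length inputs."""

    def __init__(self, in_channels, hidden_channels=40, embedding_dim=160, depth=4):
        super().__init__()
        layers = []
        channels = in_channels
        for i in range(depth):
            layers.append(nn.Conv1d(channels, hidden_channels, kernel_size=3,
                                    dilation=2 ** i))
            layers.append(nn.LeakyReLU())
            channels = hidden_channels
        self.convs = nn.ModuleList(layers)
        self.linear = nn.Linear(hidden_channels, embedding_dim)

    def forward(self, x):  # x: (batch, channels, time)
        for layer in self.convs:
            if isinstance(layer, nn.Conv1d):
                # Pad only on the left so each output depends on past values.
                pad = (layer.kernel_size[0] - 1) * layer.dilation[0]
                x = F.pad(x, (pad, 0))
            x = layer(x)
        x = torch.max(x, dim=2).values  # global max pooling over time
        return self.linear(x)


def triplet_loss(encoder, anchor, positive, negatives):
    """Pull the anchor embedding toward the positive (a subseries of the same
    series) and push it away from each negative (subseries of other series)."""
    z_a = encoder(anchor)
    z_p = encoder(positive)
    loss = -F.logsigmoid((z_a * z_p).sum(dim=1)).mean()
    for neg in negatives:
        z_n = encoder(neg)
        loss = loss - F.logsigmoid(-(z_a * z_n).sum(dim=1)).mean()
    return loss
```

Global max pooling is what makes the encoder length-agnostic: the convolutional stack runs over an arbitrary number of time steps and the pooling collapses the time axis to a fixed-size vector, so the same network embeds short and long series alike.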