Attend and Diagnose: Clinical Time Series Analysis Using Attention Models

Authors

  • Huan Song, Arizona State University
  • Deepta Rajan, IBM Almaden Research Center
  • Jayaraman Thiagarajan, Lawrence Livermore National Labs
  • Andreas Spanias, Arizona State University

DOI

https://doi.org/10.1609/aaai.v32i1.11635

Keywords

Attention model, Recurrent Neural Network, Time Series Analysis, Deep Learning, Clinical, Healthcare

Abstract

With the widespread adoption of electronic health records, there is an increased emphasis on predictive models that can effectively deal with clinical time-series data. Powered by Recurrent Neural Network (RNN) architectures with Long Short-Term Memory (LSTM) units, deep neural networks have achieved state-of-the-art results in several clinical prediction tasks. Despite the success of RNNs, their sequential nature prohibits parallelized computation, making them inefficient, particularly when processing long sequences. Recently, architectures based solely on attention mechanisms have shown remarkable success in NLP transduction tasks while being computationally superior. In this paper, for the first time, we utilize attention models for clinical time-series modeling, thereby dispensing with recurrence entirely. We develop the SAnD (Simply Attend and Diagnose) architecture, which employs a masked self-attention mechanism and uses positional encoding and dense interpolation strategies to incorporate temporal order. Furthermore, we develop a multi-task variant of SAnD to jointly infer models for multiple diagnosis tasks. Using the recent MIMIC-III benchmark datasets, we demonstrate that the proposed approach achieves state-of-the-art performance on all tasks, outperforming LSTM models and classical baselines with hand-engineered features.
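
To make the abstract's two order-encoding ingredients concrete, below is a minimal NumPy sketch of a sinusoidal positional encoding (the standard Vaswani et al., 2017 formulation) and a dense-interpolation embedding of the kind SAnD uses to summarize T per-step outputs into M ordered components. The weighting scheme, the factor M, and the toy dimensions are illustrative assumptions for exposition, not code from the paper.

```python
import numpy as np

def positional_encoding(T, d):
    """Standard sinusoidal positional encoding (Vaswani et al., 2017),
    added to the inputs so an attention-only model can exploit order."""
    pe = np.zeros((T, d))
    pos = np.arange(T)[:, None]                            # (T, 1)
    div = np.exp(-np.log(10000.0) * np.arange(0, d, 2) / d)  # (d/2,)
    pe[:, 0::2] = np.sin(pos * div)
    pe[:, 1::2] = np.cos(pos * div)
    return pe

def dense_interpolation(S, M):
    """Condense per-step representations S (T x d) into M weighted
    summaries. Step t is mapped to pseudo-position s = M * t / T and
    contributes to summary m with weight (1 - |s - m| / M)^2, so each
    summary pools mostly from nearby steps and coarse temporal order
    is preserved without recurrence."""
    T, d = S.shape
    U = np.zeros((M, d))
    for t in range(1, T + 1):
        s = M * t / T
        for m in range(1, M + 1):
            w = (1.0 - abs(s - m) / M) ** 2
            U[m - 1] += w * S[t - 1]
    return U.reshape(-1)  # flattened (M * d,) vector for a classifier head

# Toy usage: 48 hourly measurements, 64-dimensional features, M = 12.
X = np.random.randn(48, 64)
H = X + positional_encoding(48, 64)  # in SAnD, H would then pass through
                                     # masked self-attention blocks
z = dense_interpolation(H, M=12)
print(z.shape)  # (768,)
```

The design intuition is that a fixed-size, order-aware summary lets the model feed variable-length clinical sequences to a standard classifier, which is what allows recurrence to be dropped entirely.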


Published

2018-04-29

How to Cite

Song, H., Rajan, D., Thiagarajan, J., & Spanias, A. (2018). Attend and Diagnose: Clinical Time Series Analysis Using Attention Models. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1). https://doi.org/10.1609/aaai.v32i1.11635