Rough Transformers for Continuous and Efficient Time-Series Modelling

Type: Preprint

Publication Date: 2024-03-15

Citations: 1

DOI: https://doi.org/10.48550/arxiv.2403.10288

Abstract

Time-series data in real-world medical settings typically exhibit long-range dependencies and are observed at non-uniform intervals. In such contexts, traditional sequence-based recurrent models struggle. To overcome this, researchers have replaced recurrent architectures with Neural ODE-based models to handle irregularly sampled data and with Transformer-based architectures to capture long-range dependencies. Despite the success of these two approaches, both incur high computational costs even for input sequences of moderate length. To mitigate this, we introduce the Rough Transformer, a variation of the Transformer model that operates on continuous-time representations of input sequences and incurs significantly reduced computational costs, which is critical for addressing the long-range dependencies common in medical contexts. In particular, we propose multi-view signature attention, which uses path signatures to augment vanilla attention and to capture both local and global dependencies in the input data, while remaining robust to changes in sequence length and sampling frequency. On synthetic and real-world time-series tasks, we find that Rough Transformers consistently outperform their vanilla-attention counterparts while obtaining the benefits of Neural ODE-based models at a fraction of the computational time and memory cost.
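To make the idea concrete, the sketch below illustrates one way the multi-view signature attention described in the abstract can be realised: truncated (depth-2) path signatures are computed over local windows and over the growing global prefix of a possibly irregularly sampled path, producing a fixed-length token sequence that is fed to a vanilla Transformer encoder. This is a minimal sketch under stated assumptions, not the authors' implementation; the window count, signature truncation depth, and model dimensions are all illustrative choices.

# Minimal sketch of multi-view signature attention, assuming depth-2
# signatures, 16 windows, and a small PyTorch Transformer encoder.
# These choices are illustrative, not the paper's configuration.
import torch
import torch.nn as nn


def signature_depth2(path: torch.Tensor) -> torch.Tensor:
    """Depth-2 truncated signature of a piecewise-linear path.

    path: (T, d) tensor of observations (T >= 2).
    returns: (d + d*d,) tensor holding level-1 and level-2 terms.
    """
    dx = path[1:] - path[:-1]          # segment increments, (T-1, d)
    lvl1 = dx.sum(dim=0)               # level 1: X_T - X_0
    prefix = path[:-1] - path[0]       # displacement before each segment
    # Level 2 for a piecewise-linear path:
    # S^(i,j) = sum_k (prefix_k)_i (dx_k)_j + 0.5 (dx_k)_i (dx_k)_j
    lvl2 = torch.einsum("ti,tj->ij", prefix, dx) \
        + 0.5 * torch.einsum("ti,tj->ij", dx, dx)
    return torch.cat([lvl1, lvl2.flatten()])


def multi_view_features(path: torch.Tensor, num_windows: int = 16) -> torch.Tensor:
    """Local + global signature views: one token per window.

    Each token concatenates the window's local signature with the
    signature of the path prefix up to that window's end, so attention
    sees both local and global structure. Assumes T > num_windows.
    """
    T = path.shape[0]
    edges = torch.linspace(0, T - 1, num_windows + 1).long()
    tokens = []
    for k in range(num_windows):
        lo, hi = edges[k].item(), edges[k + 1].item() + 1
        local = signature_depth2(path[lo:hi])
        global_ = signature_depth2(path[:hi])
        tokens.append(torch.cat([local, global_]))
    return torch.stack(tokens)          # (num_windows, 2 * (d + d*d))


class RoughTransformerSketch(nn.Module):
    """Vanilla Transformer encoder over multi-view signature tokens."""

    def __init__(self, channels: int, d_model: int = 64, num_classes: int = 2):
        super().__init__()
        feat = 2 * (channels + channels * channels)
        self.proj = nn.Linear(feat, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, num_classes)

    def forward(self, paths: list[torch.Tensor]) -> torch.Tensor:
        # Paths may have different lengths and sampling grids; the
        # signature features map them all to the same token shape.
        tokens = torch.stack([multi_view_features(p) for p in paths])
        return self.head(self.encoder(self.proj(tokens)).mean(dim=1))


# Usage: a batch of irregular-length paths handled by one model.
model = RoughTransformerSketch(channels=3)
batch = [torch.randn(torch.randint(50, 500, (1,)).item(), 3) for _ in range(4)]
print(model(batch).shape)  # torch.Size([4, 2])

Because the signature of a path is invariant to time reparametrisation, the token sequence above has the same shape regardless of sequence length or sampling frequency, which is what lets attention run over a short, fixed number of tokens rather than over every raw observation.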

Locations

  • arXiv (Cornell University)

Similar Works

  • Rough Transformers: Lightweight Continuous-Time Sequence Modelling with Path Signatures (2024). Fernando Moreno-Pino, Álvaro Arroyo, Harrison Waldon, Xiaowen Dong, Álvaro Cartea
  • TimelyGPT: Recurrent Convolutional Transformer for Long Time-series Representation (2023). Ziyang Song, Qincheng Lu, Hao Xu, Yue Li
  • Attend and Diagnose: Clinical Time Series Analysis Using Attention Models (2018). Huan Song, Deepta Rajan, Jayaraman J. Thiagarajan, Andreas Spanias
  • Modeling Irregular Time Series with Continuous Recurrent Units (2021). Mona Schirmer, Mazin Eltayeb, Stefan Lessmann, Maja Rudolph
  • Attend and Diagnose: Clinical Time Series Analysis using Attention Models (2017). Huan Song, Deepta Rajan, Jayaraman J. Thiagarajan, Andreas Spanias
  • TrajGPT: Irregular Time-Series Representation Learning for Health Trajectory Analysis (2024). Ziyang Song, Qingcheng Lu, He Zhu, David L. Buckeridge, Yue Li
  • Bidirectional Generative Pre-training for Improving Time Series Representation Learning (2024). Ziyang Song, Qincheng Lu, Zhu He, Yue Li
  • CARRNN: A Continuous Autoregressive Recurrent Neural Network for Deep Representation Learning From Sporadic Temporal Data (2022). Mostafa Mehdipour Ghazi, Lauge Sørensen, Sébastien Ourselin, Mads Nielsen
  • ContiFormer: Continuous-Time Transformer for Irregular Time Series Modeling (2024). Yuqi Chen, Kan Ren, Yansen Wang, Yuchen Fang, Weiwei Sun, Dongsheng Li
  • CARRNN: A Continuous Autoregressive Recurrent Neural Network for Deep Representation Learning from Sporadic Temporal Data (2021). Mostafa Mehdipour Ghazi, Lauge Sørensen, Sébastien Ourselin, Mads Nielsen
  • Finding Short Signals in Long Irregular Time Series with Continuous-Time Attention Policy Networks (2023). Thomas Hartvigsen, Jidapa Thadajarassiri, Xiangnan Kong, Elke A. Rundensteiner
  • Correlated Attention in Transformers for Multivariate Time Series (2023). Minh Quang Nguyen, Lam M. Nguyen, Subhro Das
  • DuETT: Dual Event Time Transformer for Electronic Health Records (2023). Alex Labach, Aslesha Pokhrel, Xiao Shi Huang, Saba Zuberi, Seung Eun Yi, Maksims Volkovs, Tomi Poutanen, Rahul G. Krishnan
  • Interpretable Additive Recurrent Neural Networks For Multivariate Clinical Time Series (2021). Asif Rahman, Yale Chang, Jonathan Rubin
  • Self-Supervised Transformer for Sparse and Irregularly Sampled Multivariate Clinical Time-Series (2021). Sindhu Tipirneni, Chandan K. Reddy
  • Learning Long-Term Dependencies in Irregularly-Sampled Time Series (2020). Mathias Lechner, Ramin Hasani
  • WaveGNN: Modeling Irregular Multivariate Time Series for Accurate Predictions (2024). Arash Hajisafi, Maria Despoina Siampou, Bita Azarijoo, Cyrus Shahabi
  • PRformer: Pyramidal Recurrent Transformer for Multivariate Time Series Forecasting (2024). Yongbo Yu, Weizhong Yu, Feiping Nie, Xuelong Li
