Bidirectional Generative Pre-training for Improving Time Series Representation Learning

Learning time-series representations for discriminative tasks has been a long-standing challenge. Current pre-training methods are limited to either unidirectional next-token prediction or randomly masked token prediction. We propose a novel architecture called Bidirectional Timely Generative Pre-trained Transformer (BiTimelyGPT), which pre-trains on time-series data by both next-token and previous-token predictions in …
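The combined next-token and previous-token objective described above can be sketched as a simple sum of two cross-entropy losses: a forward pass predicts each token from its left context, and a backward pass predicts each token from its right context. The following is a minimal, hypothetical NumPy sketch, not the paper's implementation; function names and the equal weighting of the two losses are assumptions.

```python
import numpy as np

def cross_entropy(logits, targets):
    """Mean cross-entropy over positions. logits: (T, V), targets: (T,)."""
    shifted = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(targets)), targets].mean()

def bidirectional_pretrain_loss(fwd_logits, bwd_logits, tokens):
    """Sum of next-token and previous-token prediction losses (assumed weighting).

    fwd_logits: (T, V) logits from the forward (left-to-right) pass
    bwd_logits: (T, V) logits from the backward (right-to-left) pass
    tokens:     (T,) discretized time-series token ids
    """
    # Next-token prediction: position t predicts token t+1.
    next_loss = cross_entropy(fwd_logits[:-1], tokens[1:])
    # Previous-token prediction: position t predicts token t-1.
    prev_loss = cross_entropy(bwd_logits[1:], tokens[:-1])
    return next_loss + prev_loss
```

With uniform (all-zero) logits over a vocabulary of size V, each term reduces to log V, so the total is 2·log V, a useful sanity check for an untrained model.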