AI

Generalizable Autoregressive Modeling of Time Series Through Functional Narratives

1 Min read

Time series data are inherently functions of time, yet current transformers often learn time series by modeling them as mere concatenations of time periods, overlooking their functional properties. In this work, we propose a novel objective for transformers that learns time series by re-interpreting them as temporal functions. We build an alternative sequence of time series by constructing degradation operators of varying intensity in function space, creating augmented variants of the original sample that are abstracted or simplified to different degrees. Based on this newly generated sequence, we train an autoregressive transformer that progressively recovers the original sample from its most simplified variant. Analogous to the next-word prediction task in language, which learns narratives by connecting words, our autoregressive transformer aims to learn the Narratives of Time Series (NoTS) by connecting different functions in time. Theoretically, we justify the construction of the alternative sequence through its advantages in function approximation. When learning time series data with transformers, constructing sequences of temporal functions allows a broader class of functions (e.g., differentiation) to be approximated than sequences of time periods do, leading to a 26% performance improvement in synthetic feature-regression experiments. Experimentally, we validate NoTS on 3 different tasks across 22 real-world datasets, where we show that NoTS significantly outperforms other pre-training methods by up to 6%. Additionally, applying NoTS on top of existing transformer architectures consistently boosts performance. Our results demonstrate the potential of NoTS as a general-purpose dynamic learner, offering a viable alternative for building foundation models for time series analysis.
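To make the sequence construction concrete, here is a minimal Python sketch of the idea: apply degradation operators of increasing intensity to a series, then order the variants from most to least simplified so an autoregressive model can be trained to recover each variant from the coarser ones before it. The abstract does not specify the operators, so the moving-average smoothing, the function names (`degrade`, `build_functional_narrative`), and the intensity schedule below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def degrade(x: np.ndarray, intensity: int) -> np.ndarray:
    """Hypothetical degradation operator: moving-average smoothing.

    Larger `intensity` removes more high-frequency detail, producing a
    more simplified (abstracted) variant of the input series.
    """
    if intensity <= 1:
        return x.copy()  # intensity 1 leaves the sample unchanged
    kernel = np.ones(intensity) / intensity
    return np.convolve(x, kernel, mode="same")

def build_functional_narrative(x, intensities=(16, 8, 4, 1)):
    """Order degraded variants from most to least simplified.

    The resulting list is the "alternative sequence": an autoregressive
    transformer would be trained to predict each variant from the
    coarser ones preceding it, ending at the original sample.
    """
    return [degrade(x, k) for k in intensities]

# Toy usage: a noisy sine wave and its narrative of variants.
rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 256)
x = np.sin(t) + 0.1 * rng.standard_normal(t.size)
for step, variant in enumerate(build_functional_narrative(x)):
    mse = np.mean((variant - x) ** 2)
    print(f"step {step}: MSE vs. original = {mse:.4f}")
```

Ordering the variants coarse-to-fine mirrors next-word prediction: each autoregressive step conditions on simpler functions and adds detail, rather than conditioning on earlier time periods.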

