Less Is More: Generating Time Series with LLaMA-Style Autoregression in Simple Factorized Latent Spaces
- Siyuan Li,
- Yifan Sun,
- Lei Cheng,
- Lewen Wang,
- Yang Liu,
- Weiqing Liu,
- Jianlong Li,
- Jiang Bian,
- Shikai Fang
Generative models for multivariate time series are essential for data augmentation, simulation, and privacy preservation, yet current state-of-the-art diffusion-based approaches are slow and limited to fixed-length windows. We propose FAR-TS, a simple yet effective framework that combines disentangled factorization with an autoregressive Transformer over a discrete, quantized latent space to generate time series. Each time series is decomposed into a data-adaptive basis that captures static cross-channel correlations and temporal coefficients that are vector-quantized into discrete tokens. A LLaMA-style autoregressive Transformer then models these token sequences, enabling fast and controllable generation of sequences of arbitrary length. Owing to its streamlined design, FAR-TS achieves orders-of-magnitude faster generation than Diffusion-TS while preserving cross-channel correlations and an interpretable latent space, enabling high-quality and flexible time series synthesis.
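To make the factorize-then-tokenize idea concrete, here is a minimal NumPy sketch of the two ingredients the abstract describes: a data-adaptive basis separating static cross-channel structure from temporal coefficients, and vector quantization of those coefficients into discrete tokens. The specifics (truncated SVD as the factorization, a random untrained codebook, the shapes chosen) are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy multivariate series: T time steps, C channels.
T, C, rank, codebook_size = 64, 8, 4, 16
X = rng.standard_normal((T, C))

# Data-adaptive factorization via truncated SVD: X ≈ Z @ B,
# where B (rank x C) captures static cross-channel structure
# and Z (T x rank) holds the temporal coefficients.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
B = Vt[:rank]                # basis over channels
Z = U[:, :rank] * s[:rank]   # temporal coefficients

# Vector-quantize each time step's coefficient vector against a
# codebook (random here; learned in a real system) to get tokens.
codebook = rng.standard_normal((codebook_size, rank))
dists = ((Z[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
tokens = dists.argmin(axis=1)    # one discrete token per time step
Z_q = codebook[tokens]           # quantized coefficients

# An autoregressive model would be trained on `tokens`;
# decoding maps generated tokens back through the basis.
X_rec = Z_q @ B
print(tokens.shape, X_rec.shape)  # (64,) (64, 8)
```

Because the token sequence lives on the time axis only, an autoregressive model over it can in principle be rolled out to any length, which is the flexibility the abstract claims over fixed-window diffusion.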