GCformer: An Efficient Framework for Accurate and Scalable Long-Term Multivariate Time Series Forecasting

06/14/2023
by Yanjun Zhao, et al.

Transformer-based models have emerged as promising tools for time series forecasting. However, they struggle to make accurate predictions for long input time series: they fail to capture global dependencies within the data, and long input sequences typically lead to large model sizes and high time complexity. To address these limitations, we present GCformer, which combines a structured global convolutional branch for processing long input sequences with a local Transformer-based branch for capturing short, recent signals. A cohesive framework for the global convolution kernel is introduced, utilizing three distinct parameterization methods. The structured convolutional kernel in the global branch is specifically crafted with sublinear complexity, allowing efficient and effective processing of long and noisy input signals. Empirical studies on six benchmark datasets demonstrate that GCformer outperforms state-of-the-art methods, reducing MSE error on multivariate time series benchmarks by 4.38% and model parameters by 61.92%. Notably, the global convolutional branch can serve as a plug-in block to enhance the performance of other models, including various recently published Transformer-based models, with an average improvement of 31.93%. Our code is publicly available at https://github.com/zyj-111/GCformer.
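For intuition, here is a minimal PyTorch sketch of the two-branch idea the abstract describes: an FFT-based global convolution over the full input combined with a local Transformer over the most recent steps. This is not the paper's implementation; `GlobalConvBranch`, `GCformerSketch`, the plain learnable kernel, the `local_len` window, and the additive fusion are illustrative assumptions (the paper uses structured kernel parameterizations with sublinear complexity and its own fusion design).

```python
import torch
import torch.nn as nn


class GlobalConvBranch(nn.Module):
    """Long (causal) convolution over the whole input via FFT.

    A plain learnable kernel stands in for the paper's three
    structured parameterizations of the global kernel.
    """

    def __init__(self, seq_len: int, d_model: int):
        super().__init__()
        self.kernel = nn.Parameter(torch.randn(d_model, seq_len) * 0.02)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        L = x.size(1)
        # Zero-pad to 2L so the circular FFT convolution becomes linear.
        x_f = torch.fft.rfft(x.transpose(1, 2), n=2 * L)  # (B, D, L+1)
        k_f = torch.fft.rfft(self.kernel, n=2 * L)        # (D, L+1)
        y = torch.fft.irfft(x_f * k_f, n=2 * L)[..., :L]  # keep causal part
        return y.transpose(1, 2)                          # (B, L, D)


class GCformerSketch(nn.Module):
    """Global convolutional branch plus a local Transformer branch
    that attends only over the most recent `local_len` steps."""

    def __init__(self, seq_len: int, d_model: int,
                 local_len: int = 96, n_heads: int = 4):
        super().__init__()
        self.global_branch = GlobalConvBranch(seq_len, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, batch_first=True)
        self.local_branch = nn.TransformerEncoder(layer, num_layers=2)
        self.local_len = local_len

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        g = self.global_branch(x)                      # global dependencies
        l = self.local_branch(x[:, -self.local_len:])  # short, recent signals
        # Simple additive fusion on the recent window (assumed for brevity).
        head = g[:, :-self.local_len]
        tail = g[:, -self.local_len:] + l
        return torch.cat([head, tail], dim=1)


# Usage example on dummy data:
model = GCformerSketch(seq_len=720, d_model=64)
out = model(torch.randn(8, 720, 64))  # -> (8, 720, 64)
```

Because the kernel is applied in the frequency domain, the global branch costs O(L log L) in the sequence length rather than the O(L^2) of full attention, which is what makes long inputs tractable; the structured parameterizations in the paper reduce the kernel's parameter count further.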
