Non-Stationary long-term time series prediction based on encoding improvements and frequency domain enhancement

WANG Jian-xiao, SHEN Shi-kai, SHE Yu-mei, YANG Bin, HONG Yi, TAO Yu-hu

Journal of Yunnan University of Nationalities(Natural Sciences Edition) ›› 2025, Vol. 34 ›› Issue (03) : 350-355. DOI: 10.3969/j.issn.1672-8513.2025.03.013



Abstract

To address the fact that the Informer model accounts for neither the non-stationarity nor the frequency-domain information of real-world data, a non-stationary long-term time series prediction model is proposed. Its core ideas are encoding improvements and frequency-domain enhancement. To restore non-stationary information to the temporal dependencies, the model uses time absolute position encoding to extract interdependencies between time points. In addition, a frequency-domain enhanced channel attention based on the discrete cosine transform adaptively captures interdependencies between channels in the frequency domain, thereby improving predictability. Experimental results show that, compared with other models, the proposed model reduces mean squared error (MSE) on the dataset by 58.4% on average, and by up to 66.5%.
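The DCT-based frequency-enhanced channel attention described above can be sketched roughly as follows. This is a minimal NumPy illustration of the general idea (compute a per-channel frequency-domain descriptor with the DCT, then reweight channels by it), not the authors' implementation; the function names and the softmax gating over channels are assumptions:

```python
import numpy as np

def dct_ii(x):
    """DCT-II along the last (time) axis, via an explicit cosine basis."""
    n = x.shape[-1]
    k = np.arange(n)
    # basis[f, t] = cos(pi * (2t + 1) * f / (2n))
    basis = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    return x @ basis.T

def frequency_channel_attention(x):
    """Reweight channels of x (shape: channels x time) by their
    frequency-domain energy, computed from DCT coefficients."""
    coeffs = dct_ii(x)
    energy = np.abs(coeffs).sum(axis=-1)      # one descriptor per channel
    weights = np.exp(energy - energy.max())
    weights /= weights.sum()                  # softmax gate over channels
    return x * weights[:, None]
```

In this toy form the gate has no learned parameters; in a trainable model the channel descriptors would instead feed a small squeeze-and-excitation-style network (cf. references 13 and 14) whose output scales each channel.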

Key words

Long-term time series prediction / time absolute position encoding / frequency enhanced channel attention / discrete cosine transform

Cite this article

Download Citations
WANG Jian-xiao, SHEN Shi-kai, SHE Yu-mei, et al. Non-Stationary long-term time series prediction based on encoding improvements and frequency domain enhancement. Journal of Yunnan University of Nationalities(Natural Sciences Edition). 2025, 34(03): 350-355 https://doi.org/10.3969/j.issn.1672-8513.2025.03.013

References

1
YE H, CHEN J, GONG S, et al. ATFNet: adaptive time-frequency ensembled network for long-term time series forecasting[EB/OL]. 2024-04-08/2025-04-30.
2
ZHOU H, ZHANG S, PENG J, et al. Informer: beyond efficient transformer for long sequence time-series forecasting[C]//Proceedings of the AAAI Conference on Artificial Intelligence, 2021, 35(12): 11106-11115.
3
OGASAWARA E, MARTINEZ L C, OLIVEIRA D, et al. Adaptive normalization: a novel data normalization approach for non-stationary time series[C]//The 2010 International Joint Conference on Neural Networks (IJCNN). IEEE, 2010: 1-8.
4
PASSALIS N, TEFAS A, KANNIAINEN J, et al. Deep adaptive input normalization for time series forecasting[J]. IEEE Transactions on Neural Networks and Learning Systems, 2019, 31(9): 3760-3765.
5
KIM T, KIM J, TAE Y, et al. Reversible instance normalization for accurate time-series forecasting against distribution shift[C]//ICLR, 2022.
6
LIU Y, WU H, WANG J, et al. Non-stationary transformers: exploring the stationarity in time series forecasting[J]. Advances in Neural Information Processing Systems, 2022, 35: 9881-9893.
7
ZENG A, CHEN M, ZHANG L, et al. Are transformers effective for time series forecasting?[C]//Proceedings of the AAAI Conference on Artificial Intelligence, 2023, 37(9): 11121-11128.
8
WU H, XU J, WANG J, et al. Autoformer: decomposition transformers with auto-correlation for long-term series forecasting[C]//Advances in Neural Information Processing Systems, 2021, 34: 22419-22430.
9
WU H, XU J, WANG J, et al. TimesNet: temporal 2D-variation modeling for general time series analysis[J]. arXiv preprint arXiv:2210.02186, 2022.
10
GIBBS J W. Fourier's series[J]. Nature, 1899, 59(1539): 606.
11
VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]//Advances in Neural Information Processing Systems, 2017: 30.
12
DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2019: 4171-4186.
13
HU J, SHEN L, ALBANIE S, et al. Squeeze-and-excitation networks[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2018: 7132-7141.
14
JIANG M, ZENG P, WANG K, et al. FECAM: frequency enhanced channel attention mechanism for time series forecasting[J]. Advanced Engineering Informatics, 2023, 58: 102158.
15
CHAVES S S, LYNFIELD R, LINDEGREN M L, et al. The US influenza hospitalization surveillance network[J]. Emerging Infectious Diseases, 2015, 21(9): 1543.
16
KITAEV N, KAISER L, LEVSKAYA A, et al. Reformer: the efficient transformer[J]. arXiv preprint arXiv:2001.04451, 2020: 1-12.
17
LAI G, CHANG W, YANG Y, et al. Modeling long- and short-term temporal patterns with deep neural networks[C]//The 41st International ACM SIGIR Conference on Research & Development in Information Retrieval, 2018: 95-104.
18
LI S, JIN X, XUAN Y, et al. Enhancing the locality and breaking the memory bottleneck of transformer on time series forecasting[J]. Advances in Neural Information Processing Systems, 2019, 32: 5243-5253.
