Non-Stationary Long-Term Time Series Prediction Based on Encoding Improvements and Frequency Domain Enhancement

WANG Jian-xiao, SHEN Shi-kai, SHE Yu-mei, YANG Bin, HONG Yi, TAO Yu-hu

Journal of Yunnan University of Nationalities (Natural Sciences Edition), 2025, Vol. 34, Issue (03): 350-355. DOI: 10.3969/j.issn.1672-8513.2025.03.013

Information and Computer Science


Abstract

To address the Informer model's failure to account for the non-stationarity and frequency-domain information of real-world data, a non-stationary long-term time series prediction model is proposed. Its core ideas are encoding improvement and frequency-domain enhancement. To restore non-stationary information to the temporal dependencies, time absolute position encoding is used to extract the interdependencies between time points. In addition, a frequency-enhanced channel attention mechanism based on the discrete cosine transform adaptively captures the interdependencies between channels in the frequency domain, thereby improving prediction performance. Experimental results show that, compared with other models, the proposed model reduces the mean squared error (MSE) on the datasets by 58.4% on average, with a maximum reduction of 66.5%.
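The DCT-based channel attention described in the abstract can be illustrated with a minimal sketch (the function names, weight shapes, and the energy-based "squeeze" step below are illustrative assumptions, not the paper's implementation): each channel of a multivariate window is transformed with a type-II DCT, a per-channel frequency-energy summary is passed through a small bottleneck network with a sigmoid gate, and the resulting scores rescale the channels, squeeze-and-excitation style.

```python
import numpy as np

def dct_ii(x):
    """Type-II DCT along the last axis (naive O(N^2) implementation)."""
    n = x.shape[-1]
    k = np.arange(n)
    # basis[k, t] = cos(pi * (2t + 1) * k / (2n))
    basis = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    return x @ basis.T

def frequency_channel_attention(x, w1, w2):
    """
    x : (channels, length) multivariate series window
    w1, w2 : weights of a small bottleneck MLP (hypothetical shapes)
    Returns x with each channel rescaled by a DCT-energy-based score.
    """
    freq = dct_ii(x)                     # per-channel frequency coefficients
    energy = np.abs(freq).mean(axis=-1)  # "squeeze": one scalar per channel
    hidden = np.maximum(0.0, w1 @ energy)          # ReLU bottleneck
    scores = 1.0 / (1.0 + np.exp(-(w2 @ hidden)))  # sigmoid gate in (0, 1)
    return x * scores[:, None]           # "excite": rescale each channel

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 96))         # 4 channels, window of 96 steps
w1 = rng.standard_normal((2, 4)) * 0.1   # bottleneck: 4 -> 2
w2 = rng.standard_normal((4, 2)) * 0.1   # expand: 2 -> 4
y = frequency_channel_attention(x, w1, w2)
print(y.shape)  # (4, 96)
```

One design point worth noting: unlike the FFT, the DCT is real-valued, so the channel-attention computation stays free of complex arithmetic while still exposing frequency structure.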

Key words

Long-term time series prediction / time absolute position encoding / frequency enhanced channel attention / discrete cosine transform

CLC number

TP391.41

Cite this article

WANG Jian-xiao, SHEN Shi-kai, SHE Yu-mei, et al. Non-stationary long-term time series prediction based on encoding improvements and frequency domain enhancement[J]. Journal of Yunnan University of Nationalities (Natural Sciences Edition), 2025, 34(03): 350-355. https://doi.org/10.3969/j.issn.1672-8513.2025.03.013

References

1
YE H, CHEN J, GONG S, et al. ATFNet: adaptive time-frequency ensembled network for long-term time series forecasting[EB/OL]. 2024-04-08/2025-04-30.
2
ZHOU H, ZHANG S, PENG J, et al. Informer: beyond efficient transformer for long sequence time-series forecasting[C]//Proceedings of the AAAI Conference on Artificial Intelligence, 2021, 35(12): 11106-11115.
3
OGASAWARA E, MARTINEZ L C, OLIVEIRA D, et al. Adaptive normalization: a novel data normalization approach for non-stationary time series[C]//The 2010 International Joint Conference on Neural Networks (IJCNN). IEEE, 2010: 1-8.
4
PASSALIS N, TEFAS A, KANNIAINEN J, et al. Deep adaptive input normalization for time series forecasting[J]. IEEE Transactions on Neural Networks and Learning Systems, 2019, 31(9): 3760-3765.
5
KIM T, KIM J, TAE Y, et al. Reversible instance normalization for accurate time-series forecasting against distribution shift[C]//ICLR, 2022.
6
LIU Y, WU H, WANG J, et al. Non-stationary transformers: exploring the stationarity in time series forecasting[J]. Advances in Neural Information Processing Systems, 2022, 35: 9881-9893.
7
ZENG A, CHEN M, ZHANG L, et al. Are transformers effective for time series forecasting?[C]//Proceedings of the AAAI Conference on Artificial Intelligence, 2023, 37(9): 11121-11128.
8
WU H, XU J, WANG J, et al. Autoformer: decomposition transformers with auto-correlation for long-term series forecasting[C]//Advances in Neural Information Processing Systems, 2021, 34: 22419-22430.
9
WU H, XU J, WANG J, et al. TimesNet: temporal 2D-variation modeling for general time series analysis[J]. arXiv preprint arXiv:2210.02186, 2022.
10
GIBBS J W. Fourier's series[J]. Nature, 1899, 59(1539): 606.
11
VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]//Advances in Neural Information Processing Systems, 2017: 30.
12
DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2019: 4171-4186.
13
HU J, SHEN L, ALBANIE S, et al. Squeeze-and-excitation networks[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2018: 7132-7141.
14
JIANG M, ZENG P, WANG K, et al. FECAM: frequency enhanced channel attention mechanism for time series forecasting[J]. Advanced Engineering Informatics, 2023, 58: 102158.
15
CHAVES S S, LYNFIELD R, LINDEGREN M L, et al. The US influenza hospitalization surveillance network[J]. Emerging Infectious Diseases, 2015, 21(9): 1543.
16
KITAEV N, KAISER L, LEVSKAYA A, et al. Reformer: the efficient transformer[J]. arXiv preprint arXiv:2001.04451, 2020: 1-12.
17
LAI G, CHANG W, YANG Y, et al. Modeling long- and short-term temporal patterns with deep neural networks[C]//The 41st International ACM SIGIR Conference on Research & Development in Information Retrieval, 2018: 95-104.
18
LI S, JIN X, XUAN Y, et al. Enhancing the locality and breaking the memory bottleneck of transformer on time series forecasting[J]. Advances in Neural Information Processing Systems, 2019, 32: 5243-5253.

Funding

National Natural Science Foundation of China (61962033)
National Natural Science Foundation of China (62372076)
