Journal of Frontiers of Computer Science and Technology ›› 2025, Vol. 19 ›› Issue (1): 45-64. DOI: 10.3778/j.issn.1673-9418.2403070
Survey of Transformer-Based Model for Time Series Forecasting
MENG Xiangfu, SHI Haoyuan
Online: 2025-01-01
Published: 2024-12-31
Abstract: Time series forecasting (TSF) predicts the values and trends at future time points or over future intervals by analyzing latent information in historical data, such as trend and seasonality. Time series data, much of it produced by sensors, play an important role in finance, healthcare, energy, transportation, meteorology, and many other domains. With the spread of IoT sensors, the resulting massive time series data are difficult to handle with traditional machine learning; meanwhile, the Transformer has excelled at many tasks in natural language processing and computer vision, and researchers have exploited its ability to capture long-term dependencies to drive rapid progress in TSF. This paper surveys Transformer-based time series forecasting methods. It traces the development of TSF chronologically, systematically introduces time series preprocessing procedures and methods, and reviews commonly used evaluation metrics and datasets. Taking algorithmic frameworks as the organizing principle, it explains how the various Transformer-based models are applied to TSF tasks and how they work. It then compares the performance, strengths, and limitations of these models experimentally and analyzes and discusses the results. Finally, drawing on the challenges in existing Transformer-based TSF work, it proposes future directions for the field.
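To make the pipeline the abstract outlines concrete, here is a minimal sketch (PyTorch) of sliding-window preprocessing, a vanilla Transformer encoder that attends over the whole lookback window (the long-term-dependency capture the abstract credits Transformers with), and MSE/MAE evaluation. This is not any of the surveyed models; all names, hyperparameters, and the toy data are illustrative assumptions.

# A minimal sketch of the TSF pipeline the abstract outlines (not any surveyed
# model): sliding-window preprocessing, a vanilla Transformer encoder attending
# over the lookback window, and MSE/MAE evaluation. All names, hyperparameters,
# and the toy data below are illustrative assumptions.
import torch
import torch.nn as nn

def make_windows(series, lookback, horizon):
    """Slice a 1-D series into (input window, forecast target) pairs."""
    xs, ys = [], []
    for i in range(len(series) - lookback - horizon + 1):
        xs.append(series[i : i + lookback])
        ys.append(series[i + lookback : i + lookback + horizon])
    return torch.stack(xs), torch.stack(ys)

class TinyTransformerForecaster(nn.Module):
    def __init__(self, lookback, horizon, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Linear(1, d_model)   # scalar per step -> d_model
        self.pos = nn.Parameter(torch.zeros(1, lookback, d_model))  # learned positions
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(lookback * d_model, horizon)

    def forward(self, x):                    # x: (batch, lookback)
        h = self.embed(x.unsqueeze(-1)) + self.pos
        h = self.encoder(h)                  # self-attention over all time steps
        return self.head(h.flatten(1))       # direct multi-step forecast

# Toy usage on a synthetic seasonal series.
t = torch.arange(0, 400, dtype=torch.float32)
series = torch.sin(0.1 * t) + 0.1 * torch.randn_like(t)
x, y = make_windows(series, lookback=96, horizon=24)
model = TinyTransformerForecaster(lookback=96, horizon=24)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(50):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)   # MSE, a standard TSF metric
    loss.backward()
    opt.step()
with torch.no_grad():
    pred = model(x)
print(f"MSE={nn.functional.mse_loss(pred, y).item():.4f} "
      f"MAE={(pred - y).abs().mean().item():.4f}")

The surveyed models (e.g., Informer, Autoformer, Crossformer, and patch-based designs) differ from this sketch mainly in how they tame the quadratic cost of full attention and how they encode the series, which is precisely the design space the survey maps.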
MENG Xiangfu, SHI Haoyuan. Survey of Transformer-Based Model for Time Series Forecasting[J]. Journal of Frontiers of Computer Science and Technology, 2025, 19(1): 45-64.
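For context, the evaluation metrics the abstract refers to are typically defined as follows (these are the textbook forms, not quoted from the paper):

$$\mathrm{MSE} = \frac{1}{n}\sum_{i=1}^{n}(y_i-\hat{y}_i)^2, \qquad \mathrm{RMSE} = \sqrt{\mathrm{MSE}}, \qquad \mathrm{MAE} = \frac{1}{n}\sum_{i=1}^{n}\lvert y_i-\hat{y}_i\rvert,$$

$$\mathrm{MAPE} = \frac{100\%}{n}\sum_{i=1}^{n}\left\lvert\frac{y_i-\hat{y}_i}{y_i}\right\rvert, \qquad \mathrm{SMAPE} = \frac{100\%}{n}\sum_{i=1}^{n}\frac{\lvert y_i-\hat{y}_i\rvert}{(\lvert y_i\rvert+\lvert\hat{y}_i\rvert)/2}, \qquad R^2 = 1-\frac{\sum_{i}(y_i-\hat{y}_i)^2}{\sum_{i}(y_i-\bar{y})^2},$$

where $y_i$ are observed values, $\hat{y}_i$ the forecasts, and $\bar{y}$ the sample mean. SMAPE has several variants in the literature; the halved-denominator form shown here is one common choice.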