[1] 陈龙, 管子玉, 何金红, 等. 情感分类研究进展[J]. 计算机研究与发展, 2017, 54(6): 1150-1170.
CHEN L, GUAN Z Y, HE J H, et al. A survey on sentiment classification[J]. Journal of Computer Research and Development, 2017, 54(6): 1150-1170.
[2] 王婷, 杨文忠. 文本情感分析方法研究综述[J]. 计算机工程与应用, 2021, 57(12): 11-24.
WANG T, YANG W Z. Review of text sentiment analysis methods[J]. Computer Engineering and Applications, 2021, 57(12): 11-24.
[3] GARDNER M, ARTZI Y, BASMOVA V, et al. Evaluating models' local decision boundaries via contrast sets[C]//Findings of the Association for Computational Linguistics, Nov 16-20, 2020. Stroudsburg: ACL, 2020: 1307-1323.
[4] LI L, MA R, GUO Q, et al. BERT-Attack: adversarial attack against BERT using BERT[C]//Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, Nov 16-20, 2020. Stroudsburg: ACL, 2020: 6193-6202.
[5] MUDRAKARTA P K, TALY A, SUNDARARAJAN M, et al. Did the model understand the question?[C]//Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics, Melbourne, Jul 15-20, 2018. Stroudsburg: ACL, 2018: 1896-1906.
[6] LIU P, FU J, XIAO Y, et al. ExplainaBoard: an explainable leaderboard for NLP[C]//Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing: System Demonstrations. Stroudsburg: ACL, 2021: 280-289.
[7] 张燕平, 张铃, 吴涛. 机器学习中的多侧面递进算法 MIDA[J]. 电子学报, 2005, 33(2): 327-331.
ZHANG Y P, ZHANG L, WU T. A multi-side increase by degrees algorithm MIDA in machine learning[J]. Acta Electronica Sinica, 2005, 33(2): 327-331.
[8] 王颖洁, 朱久祺, 汪祖民, 等. 自然语言处理在文本情感分析领域应用综述[J]. 计算机应用, 2022, 42(4): 1011-1020.
WANG Y J, ZHU J Q, WANG Z M, et al. Review of applications of natural language processing in text sentiment analysis[J]. Journal of Computer Applications, 2022, 42(4): 1011-1020.
[9] 胡任远, 刘建华, 卜冠南, 等. 融合 BERT 的多层次语义协同模型情感分析研究[J]. 计算机工程与应用, 2021, 57(13): 176-184.
HU R Y, LIU J H, BU G N, et al. Research on sentiment analysis of multi-level semantic collaboration model fused with BERT[J]. Computer Engineering and Applications, 2021, 57(13): 176-184.
[10] DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Minneapolis, Jun 2-7, 2019. Stroudsburg: ACL, 2019: 4171-4186.
[11] XU H, LIU B, SHU L, et al. DomBERT: domain-oriented language model for aspect-based sentiment analysis[C]//Findings of the Association for Computational Linguistics, Nov 16-20, 2020. Stroudsburg: ACL, 2020: 1725-1731.
[12] LI Y, HU L, GAO W. Robust sparse and low-redundancy multi-label feature selection with dynamic local and global structure preservation[J]. Pattern Recognition, 2023, 134: 109120.
[13] LI M, TIAN Z, DU X, et al. Power normalized cepstral robust features of deep neural networks in a cloud computing data privacy protection scheme[J]. Neurocomputing, 2023, 518: 165-173.
[14] GUNEY H, OZTOPRAK H. A robust ensemble feature selection technique for high-dimensional datasets based on minimum weight threshold method[J]. Computational Intelligence, 2022, 38(5): 1616-1658.
[15] YAO Y Y. Three-way decisions with probabilistic rough sets[J]. Information Sciences, 2010, 180(3): 341-353.
[16] 于洪, 杨雪梅. 三支决策在工业大数据中的应用[J]. 西北大学学报 (自然科学版), 2021, 51(4): 505-515.
YU H, YANG X M. Industrial big data applications based on three-way decisions[J]. Journal of Northwest University (Natural Science Edition), 2021, 51(4): 505-515.
[17] CHEN J, CHEN Y, HE Y, et al. A classified feature representation three-way decision model for sentiment analysis[J]. Applied Intelligence, 2022, 52(7): 7995-8007.
[18] 刘芳, 李天瑞. 一种基于概率粗糙集的属性约简加速算法[J]. 计算机科学, 2016, 43(12): 63-70.
LIU F, LI T R. Accelerated attribute reduction algorithm based on probabilistic rough sets[J]. Computer Science, 2016, 43(12): 63-70.
[19] CHEN Q, ZHANG R, ZHENG Y, et al. Dual contrastive learning: text classification via label-aware data augmentation[J]. arXiv:2201.08702, 2022.
[20] SOCHER R, BAUER J, MANNING C D, et al. Parsing with compositional vector grammars[C]//Proceedings of the 51st Annual Meeting of the Association for Computational Linguistics, Sofia, Aug 4-9, 2013. Stroudsburg: ACL, 2013: 455-465.
[21] DING X, LIU B, YU P S. A holistic lexicon-based approach to opinion mining[C]//Proceedings of the 2008 International Conference on Web Search and Data Mining, Palo Alto, Feb 11-12, 2008. New York: ACM, 2008: 231-240.
[22] MANDAVA S, MIGACZ S, FLOREA A F. Pay attention when required[J]. arXiv:2009.04534, 2020.
[23] CHEN H, XIA R, YU J. Reinforced counterfactual data augmentation for dual sentiment classification[C]//Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, Nov 7-11, 2021. Stroudsburg: ACL, 2021: 269-278.
[24] GUNEL B, DU J, CONNEAU A, et al. Supervised contrastive learning for pre-trained language model fine-tuning[J]. arXiv:2011.01403, 2020.