[1] YU M, YIN W P, HASAN K S, et al. Improved neural relation detection for knowledge base question answering[C]//Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics. Stroudsburg: ACL, 2017: 571-581.
[2] TRISEDYA B D, WEIKUM G, QI J Z, et al. Neural relation extraction for knowledge base enrichment[C]//Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Stroudsburg: ACL, 2019: 229-240.
[3] MIN B N, GRISHMAN R, WAN L, et al. Distant supervision for relation extraction with an incomplete knowledge base[C]//Proceedings of the 2013 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Stroudsburg: ACL, 2013: 777-782.
[4] LESTER B, AL-RFOU R, CONSTANT N, et al. The power of scale for parameter-efficient prompt tuning[EB/OL]. [2024-05-16]. https://arxiv.org/abs/2104.08691.
[5] GAO T Y, FISCH A, CHEN D Q. Making pre-trained language models better few-shot learners[EB/OL]. [2024-05-16]. https://arxiv.org/abs/2012.15723.
[6] HU S D, DING N, WANG H D, et al. Knowledgeable prompt-tuning: incorporating knowledge into prompt verbalizer for text classification[EB/OL]. [2024-05-16]. https://arxiv.org/abs/2108.02035.
[7] GU Y X, HAN X, LIU Z Y, et al. PPT: pre-trained prompt tuning for few-shot learning[EB/OL]. [2024-05-16]. https://arxiv.org/abs/2109.04332.
[8] ZHU Y, WANG Y, QIANG J P, et al. Prompt-learning for short text classification[J]. IEEE Transactions on Knowledge and Data Engineering, 2024, 36(10): 5328-5339.
[9] 唐媛, 陈艳平, 扈应, 等. 一种面向关系抽取的表填充依赖特征学习方法[J]. 计算机工程与应用, 2024, 60(13): 143-151.
TANG Y, CHEN Y P, HU Y, et al. Table filling dependency feature learning method for relation extraction[J]. Computer Engineering and Applications, 2024, 60(13): 143-151.
[10] ZHOU G D, SU J, ZHANG J, et al. Exploring various knowledge in relation extraction[C]//Proceedings of the 43rd Annual Meeting on Association for Computational Linguistics. Stroudsburg: ACL, 2005: 427-434.
[11] ZENG D J, LIU K, LAI S W, et al. Relation classification via convolutional deep neural network[C]//Proceedings of the 25th International Conference on Computational Linguistics. Stroudsburg: ACL, 2014: 2335-2344.
[12] GENG Z Q, CHEN G F, HAN Y M, et al. Semantic relation extraction using sequential and tree-structured LSTM with attention[J]. Information Sciences, 2020, 509: 183-192.
[13] ZHANG N Y, DENG S M, SUN Z L, et al. Attention-based capsule networks with dynamic routing for relation extraction[C]//Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Stroudsburg: ACL, 2018: 986-992.
[14] ZHANG Y H, ZHONG V, CHEN D Q, et al. Position-aware attention and supervised data improve slot filling[C]//Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing. Stroudsburg: ACL, 2017: 35-45.
[15] BALDINI SOARES L, FITZGERALD N, LING J, et al. Matching the blanks: distributional similarity for relation learning[EB/OL]. [2024-05-16]. https://arxiv.org/abs/1906.03158.
[16] SCHICK T, SCHÜTZE H. Exploiting cloze questions for few shot text classification and natural language inference[EB/OL]. [2024-05-21]. https://arxiv.org/abs/2001.07676.
[17] 陈德光, 马金林, 马自萍, 等. 自然语言处理预训练技术综述[J]. 计算机科学与探索, 2021, 15(8): 1359-1389.
CHEN D G, MA J L, MA Z P, et al. Review of pre-training techniques for natural language processing[J]. Journal of Frontiers of Computer Science and Technology, 2021, 15(8): 1359-1389.
[18] RADFORD A, NARASIMHAN K, SALIMANS T, et al. Improving language understanding by generative pre-training[EB/OL]. [2024-05-21]. https://cdn.openai.com/research-covers/language-unsupervised/language_understanding_paper.pdf.
[19] DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1. Stroudsburg: ACL, 2019: 4171-4186.
[20] LIU Y H, OTT M, GOYAL N, et al. RoBERTa: a robustly optimized BERT pretraining approach[EB/OL]. [2024-05-21]. https://arxiv.org/abs/1907.11692.
[21] RAFFEL C, SHAZEER N, ROBERTS A, et al. Exploring the limits of transfer learning with a unified text-to-text transformer[J]. Journal of Machine Learning Research, 2020, 21: 5485-5551.
[22] BALDINI SOARES L, FITZGERALD N, LING J, et al. Matching the blanks: distributional similarity for relation learning[C]//Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Stroudsburg: ACL, 2019: 2895-2905.
[23] PETERS M E, NEUMANN M, LOGAN R, et al. Knowledge enhanced contextual word representations[C]//Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing. Stroudsburg: ACL, 2019: 43-54.
[24] WU S C, HE Y F. Enriching pre-trained language model with entity information for relation classification[EB/OL]. [2024-05-21]. https://arxiv.org/abs/1905.08284.
[25] LIU X, ZHENG Y N, DU Z X, et al. GPT understands, too[J]. AI Open, 2024, 5: 208-215.
[26] HAN X, ZHAO W L, DING N, et al. PTR: prompt tuning with rules for text classification[J]. AI Open, 2022, 3: 182-192.
[27] CHEN X, ZHANG N Y, XIE X, et al. KnowPrompt: knowledge-aware prompt-tuning with synergistic optimization for relation extraction[C]//Proceedings of the ACM Web Conference 2022. New York: ACM, 2022: 2778-2788.
[28] JOSHI M, CHEN D Q, LIU Y H, et al. SpanBERT: improving pre-training by representing and predicting spans[J]. Transactions of the Association for Computational Linguistics, 2020, 8: 64-77.
[29] HUGUET CABOT P L, NAVIGLI R. REBEL: relation extraction by end-to-end language generation[C]//Findings of the Association for Computational Linguistics: EMNLP 2021. Stroudsburg: ACL, 2021: 2370-2381.
[30] NIE W Z, REN M J, NIE J, et al. C-GCN: correlation based graph convolutional network for audio-video emotion recognition[J]. IEEE Transactions on Multimedia, 2021, 23: 3793-3804.
[31] ALT C, HÜBNER M, HENNIG L, et al. Improving relation extraction by pre-trained language representations[EB/OL]. [2024-05-21]. https://arxiv.org/abs/1906.03088.
[32] GUO Z J, ZHANG Y, LU W. Attention guided graph convolutional networks for relation extraction[EB/OL]. [2024-05-21]. https://arxiv.org/abs/1906.07510.
[33] XUE F Z, SUN A X, ZHANG H, et al. GDPNet: refining latent multi-view graph for relation extraction[C]//Proceedings of the 35th AAAI Conference on Artificial Intelligence. Palo Alto: AAAI Press, 2021: 14194-14202.
[34] XIE H, ZHANG Y J, HE Y, et al. Parallel attention-based LSTM for building a prediction model of vehicle emissions using PEMS and OBD[J]. Measurement, 2021, 185: 110074.
[35] CRISDAYANTI I A P A, BAK J, CHOI Y, et al. IA-BERT: context-aware sarcasm detection by incorporating incongruity attention layer for feature extraction[C]//Proceedings of the 37th ACM/SIGAPP Symposium on Applied Computing. New York: ACM, 2022: 1084-1091.