[1] 杨屹东. 建设部 推动“六精四化”巩固精进 助力数智化坚强电网建设[J]. 华北电业, 2024(2): 18.
YANG Y D. The Ministry of Construction promotes the consolidation and improvement of “six finesse and four modernizations” to facilitate the construction of a digitally intelligent and strong power grid[J]. North China Power, 2024(2): 18.
[2] 孙梦晨, 丛伟, 余江, 等. 电网运维大数据背景下的继电保护通信系统故障定位方法[J]. 电力自动化设备, 2019, 39(4): 141-147.
SUN M C, CONG W, YU J, et al. Fault locating method based on big data of power grid operation and maintenance for relay protection communication system[J]. Electric Power Automation Equipment, 2019, 39(4): 141-147.
[3] 冯斌, 张又文, 唐昕, 等. 基于BiLSTM-Attention神经网络的电力设备缺陷文本挖掘[J]. 中国电机工程学报, 2020, 40(S1): 1-10.
FENG B, ZHANG Y W, TANG X, et al. Power equipment defect record text mining based on BiLSTM-Attention neural network[J]. Proceedings of the CSEE, 2020, 40(S1): 1-10.
[4] YAO L, MAO C, LUO Y. Graph convolutional networks for text classification[C]//Proceedings of the 33rd AAAI Conference on Artificial Intelligence, Honolulu, Jan 27-Feb 1, 2019. Palo Alto: AAAI, 2019: 7370-7377.
[5] LIN Y, MENG Y, SUN X, et al. BertGCN: transductive text classification by combining GCN and BERT[EB/OL]. [2024-03-12]. https://arxiv.org/abs/2105.05727.
[6] 刘峤, 李杨, 段宏, 等. 知识图谱构建技术综述[J]. 计算机研究与发展, 2016, 53(3): 582-600.
LIU Q, LI Y, DUAN H, et al. Knowledge graph construction techniques[J]. Journal of Computer Research and Development, 2016, 53(3): 582-600.
[7] 戴宇欣, 张俊, 季知祥, 等. 基于功能缺陷文本的电力系统二次设备智能诊断与辅助决策[J]. 电力自动化设备, 2021, 41(6): 184-194.
DAI Y X, ZHANG J, JI Z X, et al. Intelligent diagnosis and auxiliary decision of power system secondary equipment based on functional defect text[J]. Electric Power Automation Equipment, 2021, 41(6): 184-194.
[8] 张鹤译, 王鑫, 韩立帆, 等. 大语言模型融合知识图谱的问答系统研究[J]. 计算机科学与探索, 2023, 17(10): 2377-2388.
ZHANG H Y, WANG X, HAN L F, et al. Research on question answering system on joint of knowledge graph and large language models[J]. Journal of Frontiers of Computer Science and Technology, 2023, 17(10): 2377-2388.
[9] 陈慧敏, 刘知远, 孙茂松. 大语言模型时代的社会机遇与挑战[J]. 计算机研究与发展, 2024, 61(5): 1094-1103.
CHEN H M, LIU Z Y, SUN M S. The social opportunities and challenges in the era of large language models[J]. Journal of Computer Research and Development, 2024, 61(5): 1094-1103.
[10] LEWIS P, PEREZ E, PIKTUS A, et al. Retrieval-augmented generation for knowledge-intensive NLP tasks[C]//Advances in Neural Information Processing Systems 33, 2020: 9459-9474.
[11] 严昊, 刘禹良, 金连文, 等. 类ChatGPT大模型发展、应用和前景[J]. 中国图象图形学报, 2023, 28(9): 2749-2762.
YAN H, LIU Y L, JIN L W, et al. The development, application, and future of LLM similar to ChatGPT[J]. Journal of Image and Graphics, 2023, 28(9): 2749-2762.
[12] VELICKOVIC P, CUCURULL G, CASANOVA A, et al. Graph attention networks[EB/OL]. [2024-03-12]. https://arxiv.org/abs/1710.10903.
[13] LIU Y, OTT M, GOYAL N, et al. RoBERTa: a robustly optimized BERT pretraining approach[EB/OL]. [2024-03-12]. https://arxiv.org/abs/1907.11692.
[14] ZHENG Y, ZHANG R, ZHANG J, et al. LlamaFactory: unified efficient fine-tuning of 100+ language models[EB/OL]. [2024-03-12]. https://arxiv.org/abs/2403.13372.
[15] BAI Y, DU X, LIANG Y, et al. COIG-CQIA: quality is all you need for Chinese instruction fine-tuning[EB/OL]. [2024-03-12]. https://arxiv.org/abs/2403.18058.
[16] KIM Y. Convolutional neural networks for sentence classification[EB/OL]. [2024-03-12]. https://arxiv.org/abs/1408.5882.
[17] LIU P, QIU X, HUANG X. Recurrent neural network for text classification with multi-task learning[EB/OL]. [2024-03-12]. https://arxiv.org/abs/1605.05101.
[18] LYU S F, LIU J Q. Convolutional recurrent neural networks for text classification[J]. Journal of Database Management, 2021, 32(4): 65-82.
[19] DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[EB/OL]. [2024-03-12]. https://arxiv.org/abs/1810.04805.
[20] DETTMERS T, PAGNONI A, HOLTZMAN A, et al. QLoRA: efficient finetuning of quantized LLMs[C]//Advances in Neural Information Processing Systems 36, 2023.