[1] 李然, 林政, 林海伦, 等. 文本情绪分析综述[J]. 计算机研究与发展, 2018, 55(1): 30-52.
LI R, LIN Z, LIN H L, et al. Text emotion analysis: a survey[J]. Journal of Computer Research and Development, 2018, 55(1): 30-52.
[2] 刘知远, 孙茂松, 林衍凯, 等. 知识表示学习研究进展[J]. 计算机研究与发展, 2016, 53(2): 247-261.
LIU Z Y, SUN M S, LIN Y K, et al. Knowledge representation learning: a review[J]. Journal of Computer Research and Development, 2016, 53(2): 247-261.
[3] WANG Q, MAO Z, WANG B, et al. Knowledge graph embedding: a survey of approaches and applications[J]. IEEE Transactions on Knowledge and Data Engineering, 2017, 29(12): 2724-2743.
[4] BACKSTROM L, LESKOVEC J. Supervised random walks: predicting and recommending links in social networks[C]//Proceedings of the 4th ACM International Conference on Web Search and Data Mining, Hong Kong, China, Feb 9-12, 2011. New York: ACM, 2011: 635-644.
[5] LESKOVEC J, HUTTENLOCHER D, KLEINBERG J. Predicting positive and negative links in online social networks[C]//Proceedings of the 19th International Conference on World Wide Web, North Carolina, Apr 26-30, 2010. New York: ACM, 2010: 641-650.
[6] WU Z, PAN S, CHEN F, et al. A comprehensive survey on graph neural networks[J]. IEEE Transactions on Neural Networks and Learning Systems, 2020, 32(1): 4-24.
[7] BORDES A, USUNIER N, GARCIA-DURAN A, et al. Translating embeddings for modeling multi-relational data[C]//Advances in Neural Information Processing Systems 26, Lake Tahoe, Dec 5-8, 2013: 2787-2795.
[8] ZHOU J, CUI G, HU S, et al. Graph neural networks: a review of methods and applications[J]. AI Open, 2020, 1: 57-81.
[9] KIPF T N, WELLING M. Semi-supervised classification with graph convolutional networks[J]. arXiv:1609.02907, 2016.
[10] VELIČKOVIĆ P, CUCURULL G, CASANOVA A, et al. Graph attention networks[J]. arXiv:1710.10903, 2017.
[11] SONG W, XIAO Z, WANG Y, et al. Session-based social recommendation via dynamic graph attention networks[C]//Proceedings of the 12th ACM International Conference on Web Search and Data Mining, Melbourne, Feb 11-15, 2019. New York: ACM, 2019: 555-563.
[12] NATHANI D, CHAUHAN J, SHARMA C, et al. Learning attention-based embeddings for relation prediction in knowledge graphs[J]. arXiv:1906.01195, 2019.
[13] NGUYEN D Q, NGUYEN T D, NGUYEN D Q, et al. A novel embedding model for knowledge base completion based on convolutional neural network[J]. arXiv:1712.02121, 2017.
[14] PANG Y, SUN M, JIANG X, et al. Convolution in convolution for network in network[J]. IEEE Transactions on Neural Networks and Learning Systems, 2017, 29(5): 1587-1597.
[15] DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[J]. arXiv:1810.04805, 2018.
[16] YU D, WANG H, CHEN P, et al. Mixed pooling for convolutional neural networks[C]//Proceedings of the 2014 International Conference on Rough Sets and Knowledge Technology, Shanghai, Oct 24-26, 2014. Cham: Springer, 2014: 364-375.
[17] WANG Z, ZHANG J, FENG J, et al. Knowledge graph embedding by translating on hyperplanes[C]//Proceedings of the 2014 AAAI Conference on Artificial Intelligence, Jul 27-31, 2014. Menlo Park: AAAI, 2014: 1112-1119.
[18] NICKEL M, TRESP V, KRIEGEL H P. Factorizing YAGO: scalable machine learning for linked data[C]//Proceedings of the 21st International Conference on World Wide Web, Lyon, Apr 16-20, 2012. New York: ACM, 2012: 271-280.
[19] XIE Z, ZHOU G, LIU J, et al. ReInceptionE: relation-aware inception network with joint local-global structural information for knowledge graph embedding[C]//Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, Jul 5-10, 2020. Stroudsburg: ACL, 2020: 5929-5939.
[20] TURC I, CHANG M W, LEE K, et al. Well-read students learn better: on the importance of pre-training compact models[J]. arXiv:1908.08962, 2019.
[21] 陈成, 张皞, 李永强, 等. 关系生成图注意力网络的知识图谱链接预测[J]. 浙江大学学报(工学版), 2022, 56(5): 1025-1034.
CHEN C, ZHANG H, LI Y Q, et al. Knowledge graph link prediction based on relational generative graph attention network[J]. Journal of Zhejiang University (Engineering Science), 2022, 56(5): 1025-1034.
[22] KINGMA D P, BA J. Adam: a method for stochastic optimization[J]. arXiv:1412.6980, 2014.
[23] BOLLACKER K, EVANS C, PARITOSH P, et al. Freebase: a collaboratively created graph database for structuring human knowledge[C]//Proceedings of the 2008 ACM SIGMOD International Conference on Management of Data, Vancouver, Jun 9-12, 2008. New York: ACM, 2008: 1247-1250.
[24] MILLER G A. WordNet: a lexical database for English[J]. Communications of the ACM, 1995, 38(11): 39-41.
[25] SONG T, LUO J, HUANG L. Rot-Pro: modeling transitivity by projection in knowledge graph embedding[C]//Advances in Neural Information Processing Systems 34, Dec 6-14, 2021: 24695-24706.
[26] YANG B, YIH W, HE X, et al. Embedding entities and relations for learning and inference in knowledge bases[J]. arXiv:1412.6575, 2014.
[27] DETTMERS T, MINERVINI P, STENETORP P, et al. Convolutional 2D knowledge graph embeddings[C]//Proceedings of the 2018 AAAI Conference on Artificial Intelligence, New Orleans, Feb 2-7, 2018. Menlo Park: AAAI, 2018: 1811-1818.
[28] SHANG C, TANG Y, HUANG J, et al. End-to-end structure-aware convolutional networks for knowledge base completion[C]//Proceedings of the 2019 AAAI Conference on Artificial Intelligence, Hawaii, Jan 27-Feb 1, 2019. Menlo Park: AAAI, 2019: 3060-3067.
[29] YAO L, MAO C, LUO Y. KG-BERT: BERT for knowledge graph completion[J]. arXiv:1909.03193, 2019.
[30] 陈新元, 谢晟祎, 陈庆强, 等. 结合平移关系嵌入和CNN的知识图谱补全[J]. 中文信息学报, 2021, 35(1): 54-63.
CHEN X Y, XIE S Y, CHEN Q Q, et al. Knowledge base completion based on transitional relation embedding via CNN[J]. Journal of Chinese Information Processing, 2021, 35(1): 54-63.