[1] ZHOU J, CUI G, HU S, et al. Graph neural networks: a review of methods and applications[J]. AI Open, 2020, 1: 57-81.
[2] ZENG H, ZHOU H, SRIVASTAVA A, et al. GraphSAINT: graph sampling based inductive learning method[EB/OL]. [2023-08-25]. https://arxiv.org/abs/1907.04931.
[3] KIPF T N, WELLING M. Semi-supervised classification with graph convolutional networks[EB/OL]. [2023-08-25]. https://arxiv.org/abs/1609.02907.
[4] HAMILTON W, YING Z, LESKOVEC J. Inductive representation learning on large graphs[C]//Advances in Neural Information Processing Systems 30, Long Beach, Dec 4-9, 2017: 1024-1034.
[5] ZHENG X, LIU Y, PAN S, et al. Graph neural networks for graphs with heterophily: a survey[EB/OL]. [2023-08-25]. https://arxiv.org/abs/2202.07082.
[6] MAEHARA T. Revisiting graph neural networks: all we have is low-pass filters[EB/OL]. [2023-08-25]. https://arxiv.org/abs/1905.09550.
[7] YAN Y, HASHEMI M, SWERSKY K, et al. Two sides of the same coin: heterophily and oversmoothing in graph convolutional neural networks[C]//Proceedings of the 2022 IEEE International Conference on Data Mining. Piscataway: IEEE, 2022: 1287-1292.
[8] MOSTAFA H, NASSAR M, MAJUMDAR S. On local aggregation in heterophilic graphs[EB/OL]. [2023-08-25]. https://arxiv.org/abs/2106.03213.
[9] ZHENG X, ZHANG M, CHEN C, et al. Auto-HeG: automated graph neural network on heterophilic graphs[EB/OL]. [2023-08-25]. https://arxiv.org/abs/2302.12357.
[10] SONG Y, ZHOU C, WANG X, et al. Ordered GNN: ordering message passing to deal with heterophily and over-smoothing[EB/OL]. [2023-08-25]. https://arxiv.org/abs/2302.01524.
[11] LIU Z, DOU Y, YU P S, et al. Alleviating the inconsistency problem of applying graph neural network to fraud detection[C]//Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval. New York: ACM, 2020: 1569-1572.
[12] DOU Y, LIU Z, SUN L, et al. Enhancing graph neural network-based fraud detectors against camouflaged fraudsters[C]//Proceedings of the 29th ACM International Conference on Information & Knowledge Management. New York: ACM, 2020: 315-324.
[13] HUANG X, YANG Y, WANG Y, et al. DGraph: a large-scale financial dataset for graph anomaly detection[C]//Advances in Neural Information Processing Systems 35, New Orleans, Nov 28-Dec 9, 2022: 22765-22777.
[14] WEBER M, DOMENICONI G, CHEN J, et al. Anti-money laundering in bitcoin: experimenting with graph convolutional networks for financial forensics[EB/OL]. [2023-08-25]. https://arxiv.org/abs/1908.02591.
[15] HU Z, DONG Y, WANG K, et al. Heterogeneous graph transformer[C]//Proceedings of the Web Conference 2020. New York: ACM, 2020: 2704-2710.
[16] ZHAO J, WANG X, SHI C, et al. Heterogeneous graph structure learning for graph neural networks[C]//Proceedings of the 2021 AAAI Conference on Artificial Intelligence. Menlo Park: AAAI, 2021: 4697-4705.
[17] WANG X, JI H, SHI C, et al. Heterogeneous graph attention network[C]//Proceedings of the 2019 World Wide Web Conference. New York: ACM, 2019: 2022-2032.
[18] GILMER J, SCHOENHOLZ S S, RILEY P F, et al. Neural message passing for quantum chemistry[C]//Proceedings of the 34th International Conference on Machine Learning, Sydney, Aug 6-11, 2017: 1263-1272.
[19] WANG Y, DERR T. Tree decomposed graph neural network[C]//Proceedings of the 30th ACM International Conference on Information & Knowledge Management. New York: ACM, 2021: 2040-2049.
[20] JIN W, DERR T, WANG Y, et al. Node similarity preserving graph convolutional networks[C]//Proceedings of the 14th ACM International Conference on Web Search and Data Mining. New York: ACM, 2021: 148-156.
[21] ABU-EL-HAIJA S, PEROZZI B, KAPOOR A, et al. MixHop: higher-order graph convolutional architectures via sparsified neighborhood mixing[C]//Proceedings of the 36th International Conference on Machine Learning, Long Beach, Jun 9-15, 2019: 21-29.
[22] ZHU J, YAN Y, ZHAO L, et al. Beyond homophily in graph neural networks: current limitations and effective designs[C]//Advances in Neural Information Processing Systems 33, Dec 6-12, 2020: 7793-7804.
[23] PEI H, WEI B, CHANG K C C, et al. Geom-GCN: geometric graph convolutional networks[EB/OL]. [2023-08-25]. https://arxiv.org/abs/2002.05287.
[24] LIU M, WANG Z, JI S. Non-local graph neural networks[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2022, 44(12): 10270-10276.
[25] YUAN H, JI S. Node2Seq: towards trainable convolutions in graph neural networks[EB/OL]. [2023-08-25]. https://arxiv.org/abs/2101.01849.
[26] YANG T, WANG Y, YUE Z, et al. Graph pointer neural networks[C]//Proceedings of the 2022 AAAI Conference on Artificial Intelligence. Menlo Park: AAAI, 2022: 8832-8839.
[27] BO D, WANG X, SHI C, et al. Beyond low-frequency information in graph convolutional networks[C]//Proceedings of the 2021 AAAI Conference on Artificial Intelligence. Menlo Park: AAAI, 2021: 3950-3957.
[28] LUAN S, HUA C, LU Q, et al. Is heterophily a real nightmare for graph neural networks to do node classification?[EB/OL]. [2023-08-25]. https://arxiv.org/abs/2109.05641.
[29] SURESH S, BUDDE V, NEVILLE J, et al. Breaking the limit of graph neural networks by improving the assortativity of graphs with local mixing patterns[C]//Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery and Data Mining. New York: ACM, 2021: 1541-1551.
[30] JIN D, YU Z, HUO C, et al. Universal graph convolutional networks[C]//Advances in Neural Information Processing Systems 34, Dec 6-14, 2021: 10654-10664.
[31] XU K, LI C, TIAN Y, et al. Representation learning on graphs with jumping knowledge networks[C]//Proceedings of the 35th International Conference on Machine Learning, Stockholm, Jul 10-15, 2018: 5453-5462.
[32] CHEN M, WEI Z, HUANG Z, et al. Simple and deep graph convolutional networks[C]//Proceedings of the 37th International Conference on Machine Learning, Jul 13-18, 2020: 1725-1735.
[33] KONG L, CHEN Y, ZHANG M. Geodesic graph neural network for efficient graph representation learning[C]//Advances in Neural Information Processing Systems 35, New Orleans, Nov 28-Dec 9, 2022: 5896-5909.
[34] LI P, WANG Y, WANG H, et al. Distance encoding: design provably more powerful neural networks for graph representation learning[C]//Advances in Neural Information Processing Systems 33, Dec 6-12, 2020: 4465-4478.
[35] SHERVASHIDZE N, SCHWEITZER P, VAN LEEUWEN E J, et al. Weisfeiler-Lehman graph kernels[J]. Journal of Machine Learning Research, 2011, 12(9): 2539-2561.
[36] DAI E, JIN W, LIU H, et al. Towards robust graph neural networks for noisy graphs with sparse labels[C]//Proceedings of the 15th ACM International Conference on Web Search and Data Mining. New York: ACM, 2022: 181-191.
[37] SHI F, CAO Y, SHANG Y, et al. H2-FDetector: a GNN-based fraud detector with homophilic and heterophilic connections[C]//Proceedings of the ACM Web Conference 2022. New York: ACM, 2022: 1486-1494.
[38] ZAKNICH A. Introduction to the modified probabilistic neural network for general signal processing applications[J]. IEEE Transactions on Signal Processing, 1998, 46(7): 1980-1990.
[39] WU F, SOUZA A, ZHANG T, et al. Simplifying graph convolutional networks[C]//Proceedings of the 36th International Conference on Machine Learning, Long Beach, Jun 9-15, 2019: 6861-6871.
[40] VELIČKOVIĆ P, CUCURULL G, CASANOVA A, et al. Graph attention networks[EB/OL]. [2023-08-25]. https://arxiv.org/abs/1710.10903.