[1] TRAN K A, KONDRASHOVA O, BRADLEY A, et al. Deep learning in cancer diagnosis, prognosis and treatment selection[J]. Genome Medicine, 2021, 13(1): 152.
[2] AGGARWAL C C, YU P S. Outlier detection for high dimensional data[C]//Proceedings of the 2001 ACM SIGMOD International Conference on Management of Data. New York: ACM, 2001: 37-46.
[3] BUDA M, MAKI A, MAZUROWSKI M A. A systematic study of the class imbalance problem in convolutional neural networks[J]. Neural Networks, 2018, 106: 249-259.
[4] ZHANG Y F, KANG B Y, HOOI B, et al. Deep long-tailed learning: a survey[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2023, 45(9): 10795-10816.
[5] HE H B, GARCIA E A. Learning from imbalanced data[J]. IEEE Transactions on Knowledge and Data Engineering, 2009, 21(9): 1263-1284.
[6] WANG Z, ZHU Z H, LI D D. Collaborative and geometric multi-kernel learning for multi-class classification[J]. Pattern Recognition, 2020, 99: 107050.
[7] ZHU Y J, WANG Z, ZHA H Y, et al. Boundary-eliminated pseudoinverse linear discriminant for imbalanced problems[J]. IEEE Transactions on Neural Networks and Learning Systems, 2018, 29(6): 2581-2594.
[8] GHAZIKHANI A, YAZDI H S, MONSEFI R. Class imbalance handling using wrapper-based random oversampling[C]//Proceedings of the 20th Iranian Conference on Electrical Engineering. Piscataway: IEEE, 2012: 611-616.
[9] CHAWLA N V, BOWYER K W, HALL L O, et al. SMOTE: synthetic minority over-sampling technique[J]. Journal of Artificial Intelligence Research, 2002, 16: 321-357.
[10] HAN H, WANG W Y, MAO B H. Borderline-SMOTE: a new over-sampling method in imbalanced data sets learning[C]//Proceedings of the 2005 International Conference on Intelligent Computing. Berlin, Heidelberg: Springer, 2005: 878-887.
[11] HE H B, BAI Y, GARCIA E A, et al. ADASYN: adaptive synthetic sampling approach for imbalanced learning[C]//Proceedings of the 2008 IEEE International Joint Conference on Neural Networks. Piscataway: IEEE, 2008: 1322-1328.
[12] GARCÍA V, SÁNCHEZ J S, MARQUÉS A I, et al. Understanding the apparent superiority of over-sampling through an analysis of local information for class-imbalanced data[J]. Expert Systems with Applications, 2020, 158: 113026.
[13] YEN S J, LEE Y S. Cluster-based under-sampling approaches for imbalanced data distributions[J]. Expert Systems with Applications, 2009, 36(3): 5718-5727.
[14] LIU X Y, WU J X, ZHOU Z H. Exploratory undersampling for class-imbalance learning[J]. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), 2009, 39(2): 539-550.
[15] KUKAR M, KONONENKO I. Cost-sensitive learning with neural networks[C]//Proceedings of the 13th European Conference on Artificial Intelligence. Hoboken: John Wiley & Sons, 1998: 88-94.
[16] SUN Y M, KAMEL M S, WONG A K C, et al. Cost-sensitive boosting for classification of imbalanced data[J]. Pattern Recognition, 2007, 40(12): 3358-3378.
[17] WONG M L, SENG K, WONG P K. Cost-sensitive ensemble of stacked denoising autoencoders for class imbalance problems in business domain[J]. Expert Systems with Applications, 2020, 141: 112918.
[18] GENG Y, LUO X Y. Cost-sensitive convolutional neural networks for imbalanced time series classification[J]. Intelligent Data Analysis, 2019, 23(2): 357-370.
[19] CHUNG Y A, YANG S W, LIN H T. Cost-sensitive deep learning with layer-wise cost estimation[C]//Proceedings of the 2020 International Conference on Technologies and Applications of Artificial Intelligence. Piscataway: IEEE, 2020: 108-113.
[20] LU Y, CHEUNG Y M, TANG Y Y. Hybrid sampling with bagging for class imbalance learning[C]//Proceedings of the 2016 Pacific-Asia Conference on Knowledge Discovery and Data Mining. Cham: Springer, 2016: 14-26.
[21] CHAWLA N V, LAZAREVIC A, HALL L O, et al. SMOTEBoost: improving prediction of the minority class in boosting[C]//Proceedings of the 7th European Conference on Principles and Practice of Knowledge Discovery in Databases. Berlin, Heidelberg: Springer, 2003: 107-119.
[22] SEIFFERT C, KHOSHGOFTAAR T M, VAN HULSE J, et al. RUSBoost: a hybrid approach to alleviating class imbalance[J]. IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, 2010, 40(1): 185-197.
[23] FAN W, STOLFO S J, ZHANG J, et al. AdaCost: misclassification cost-sensitive boosting[C]//Proceedings of the 16th International Conference on Machine Learning. San Francisco: Morgan Kaufmann, 1999: 97-105.
[24] LIN T Y, GOYAL P, GIRSHICK R, et al. Focal loss for dense object detection[C]//Proceedings of the 2017 IEEE International Conference on Computer Vision. Piscataway: IEEE, 2017: 2999-3007.
[25] CAO K D, WEI C, GAIDON A, et al. Learning imbalanced datasets with label-distribution-aware margin loss[EB/OL]. [2024-06-17]. https://arxiv.org/abs/1906.07413.
[26] ZHOU B Y, CUI Q, WEI X S, et al. BBN: bilateral-branch network with cumulative learning for long-tailed visual recognition[C]//Proceedings of the 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition. Piscataway: IEEE, 2020: 9716-9725.
[27] CUI Y, JIA M L, LIN T Y, et al. Class-balanced loss based on effective number of samples[C]//Proceedings of the 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition. Piscataway: IEEE, 2019: 9260-9269.
[28] BRONSTEIN M M, BRUNA J, COHEN T, et al. Geometric deep learning: grids, groups, graphs, geodesics, and gauges[EB/OL]. [2024-06-17]. https://arxiv.org/abs/2104.13478.
[29] MONTI F, BOSCAINI D, MASCI J, et al. Geometric deep learning on graphs and manifolds using mixture model CNNs[C]//Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition. Piscataway: IEEE, 2017: 5425-5434.
[30] BRONSTEIN M M, BRUNA J, LECUN Y, et al. Geometric deep learning: going beyond Euclidean data[J]. IEEE Signal Processing Magazine, 2017, 34(4): 18-42.
[31] DONG C, LOY C C, HE K M, et al. Image super-resolution using deep convolutional networks[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2016, 38(2): 295-307.
[32] GOODFELLOW I J, POUGET-ABADIE J, MIRZA M, et al. Generative adversarial networks[EB/OL]. [2024-06-17]. https://arxiv.org/abs/1406.2661.
[33] PUJOL O, MASIP D. Geometry-based ensembles: toward a structural characterization of the classification boundary[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2009, 31(6): 1140-1146.
[34] ZHU Z H, WANG Z, LI D D, et al. Geometric structural ensemble learning for imbalanced problems[J]. IEEE Transactions on Cybernetics, 2020, 50(4): 1617-1629.
[35] TORRES L C B, CASTRO C L, COELHO F, et al. Large margin Gaussian mixture classifier with a Gabriel graph geometric representation of data set structure[J]. IEEE Transactions on Neural Networks and Learning Systems, 2021, 32(3): 1400-1406.
[36] GHASEMIGOL M, MONSEFI R, YAZDI H S. Ellipse support vector data description[C]//Proceedings of the 11th International Conference on Engineering Applications of Neural Networks. Berlin, Heidelberg: Springer, 2009: 257-268.
[37] TAX D M J, DUIN R P W. Support vector data description[J]. Machine Learning, 2004, 54(1): 45-66.
[38] KAKDE D, CHAUDHURI A, KONG S, et al. Peak criterion for choosing Gaussian kernel bandwidth in support vector data description[C]//Proceedings of the 2017 IEEE International Conference on Prognostics and Health Management. Piscataway: IEEE, 2017: 32-39.
[39] ZHOU Z H, LIU X Y. Training cost-sensitive neural networks with methods addressing the class imbalance problem[J]. IEEE Transactions on Knowledge and Data Engineering, 2006, 18(1): 63-77.
[40] KIM M J, KANG D K, KIM H B. Geometric mean based boosting algorithm with over-sampling to resolve data imbalance problem for bankruptcy prediction[J]. Expert Systems with Applications, 2015, 42(3): 1074-1082.
[41] ZHANG S C, LI X L, ZONG M, et al. Efficient kNN classification with different numbers of nearest neighbors[J]. IEEE Transactions on Neural Networks and Learning Systems, 2018, 29(5): 1774-1785.
[42] WARD J H, Jr. Hierarchical grouping to optimize an objective function[J]. Journal of the American Statistical Association, 1963, 58(301): 236-244.
[43] WANG Z, DONG Q D, GUO W, et al. Geometric imbalanced deep learning with feature scaling and boundary sample mining[J]. Pattern Recognition, 2022, 126: 108564.
[44] NAPIERAŁA K, STEFANOWSKI J, WILK S. Learning from imbalanced data in presence of noisy and borderline examples[C]//Proceedings of the 7th International Conference on Rough Sets and Current Trends in Computing. Berlin, Heidelberg: Springer, 2010: 158-167.
[45] DONG Q, GONG S G, ZHU X T. Class rectification hard mining for imbalanced deep learning[C]//Proceedings of the 2017 IEEE International Conference on Computer Vision. Piscataway: IEEE, 2017: 1869-1878.
[46] HOANG C, CHOI Y, CARLBERG K. Domain-decomposition least-squares Petrov-Galerkin (DD-LSPG) nonlinear model reduction[J]. Computer Methods in Applied Mechanics and Engineering, 2021, 384: 113997.
[47] WOO S, PARK J, LEE J Y, et al. CBAM: convolutional block attention module[C]//Proceedings of the 15th European Conference on Computer Vision. Cham: Springer, 2018: 3-19.
[48] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]//Proceedings of the 31st International Conference on Neural Information Processing Systems. Red Hook: Curran Associates, 2017: 5998-6008.
[49] NAM H, HA J W, KIM J. Dual attention networks for multimodal reasoning and matching[C]//Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition. Piscataway: IEEE, 2017: 2156-2164.
[50] KRIZHEVSKY A, HINTON G. Learning multiple layers of features from tiny images[EB/OL]. [2024-06-17]. https://www.cs.toronto.edu/~kriz/learning-features-2009-TR.pdf.
[51] DARLOW L N, CROWLEY E J, ANTONIOU A, et al. CINIC-10 is not ImageNet or CIFAR-10[EB/OL]. [2024-06-20]. https://arxiv.org/abs/1810.03505.
[52] HU J, SHEN L, SUN G. Squeeze-and-excitation networks[C]//Proceedings of the 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition. Piscataway: IEEE, 2018: 7132-7141.
[53] WANG F, JIANG M Q, QIAN C, et al. Residual attention network for image classification[C]//Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition. Piscataway: IEEE, 2017: 6450-6458.