[1] HU Q H, PAN W W, ZHANG L, et al. Feature selection for monotonic classification[J]. IEEE Transactions on Fuzzy Systems, 2012, 20(1): 69-81.
[2] QIAN W B, SHU W H. Mutual information criterion for feature selection from incomplete data[J]. Neurocomputing, 2015, 168: 210-220.
[3] DASH M, LIU H. Feature selection for classification[J]. Intelligent Data Analysis, 1997, 1: 131-156.
[4] CHANDRASHEKAR G, SAHIN F. A survey on feature selection methods[J]. Computers and Electrical Engineering, 2014, 40: 16-28.
[5] HU Q H, PAN W W, SONG Y P, et al. Large-margin feature selection for monotonic classification[J]. Knowledge-Based Systems, 2012, 31: 8-18.
[6] PAN W W, HU Q H, SONG Y P, et al. Feature selection for monotonic classification via maximizing monotonic dependency[J]. International Journal of Computational Intelligence Systems, 2014, 7(3): 543-555.
[7] PAN W W, HU Q H. An improved feature selection algorithm for ordinal classification[J]. IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences, 2016, E99-A(12): 2266-2274.
[8] VILLELA S M, LEITE S D C, XAVIER A E, et al. An ordered search with a large margin classifier for feature selection[J]. Applied Soft Computing Journal, 2021, 100: 106930.
[9] ESTÉVEZ P A, TESMER M, PEREZ C A, et al. Normalized mutual information feature selection[J]. IEEE Transactions on Neural Networks, 2009, 20(2): 189-201.
[10] VERGARA J R, ESTÉVEZ P A. A review of feature selection methods based on mutual information[J]. Neural Computing and Applications, 2014, 24: 175-186.
[11] KWAK N, CHOI C H. Input feature selection by mutual information based on Parzen window[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2002, 24(12): 1667-1671.
[12] BATTITI R. Using mutual information for selecting features in supervised neural net learning[J]. IEEE Transactions on Neural Networks, 1994, 5(4): 537-550.
[13] PENG H C, LONG F H, DING C. Feature selection based on mutual information: criteria of max-dependency, max-relevance, and min-redundancy[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2005, 27(8): 1226-1238.
[14] XU H, LIANG J Y, WANG B L. Bi-direction rank mutual information based decision trees for monotonic classification[J]. Journal of Nanjing University (Natural Science), 2013, 49(5): 628-636. (in Chinese)
[15] LIN Y J, HU Q H, LIU J H, et al. Streaming feature selection for multilabel learning based on fuzzy mutual information[J]. IEEE Transactions on Fuzzy Systems, 2017, 25(6): 1491-1507.
[16] DAI J H, WANG W T, XU Q. An uncertainty measure for incomplete decision tables and its applications[J]. IEEE Transactions on Cybernetics, 2013, 43(4): 1277-1289.
[17] ZHANG X, MEI C L, CHEN D G, et al. Feature selection in mixed data: a method using a novel fuzzy rough set-based information entropy[J]. Pattern Recognition, 2016, 56: 1-15.
[18] CANO J R, GUTIERREZ P A, KRAWCZYK B, et al. Monotonic classification: an overview on algorithms, performance measures and data sets[J]. Neurocomputing, 2019, 341: 168-182.