[1] PARMEZAN A R S, SOUZA V M A, BATISTA G E. Towards hierarchical classification of data streams[C]//Proceedings of the 23rd Iberoamerican Congress on Progress in Pattern Recognition, Image Analysis, Computer Vision, and Applications, Madrid, Nov 19-22, 2018: 314-322.
[2] SILLA C N, FREITAS A A. A survey of hierarchical classification across different application domains[J]. Data Mining and Knowledge Discovery, 2011, 22: 31-72.
[3] OSMANI A, HAMIDI M, ALIZADEH P. Clustering approach to solve hierarchical classification problem complexity[C]// Proceedings of the 36th AAAI Conference on Artificial Intelligence, the 34th Conference on Innovative Applications of Artificial Intelligence, the 12th Symposium on Educational Advances in Artificial Intelligence, Feb 22-Mar 1, 2022: 7904-7912.
[4] BELLMUND J L S, GÄRDENFORS P, MOSER E I, et al. Navigating cognition: spatial codes for human thinking[J]. Science, 2018, 362(6415): eaat6766.
[5] ARONOV D, NEVERS R, TANK D. Mapping of a non-spatial dimension by the hippocampal-entorhinal circuit[J]. Nature, 2017, 543: 719-722.
[6] 刘浩阳, 林耀进, 刘景华, 等. 由粗到细的分层特征选择[J]. 电子学报, 2022, 50(11): 2778-2789.
LIU H Y, LIN Y J, LIU J H, et al. Hierarchical feature selection from coarse to fine[J]. Acta Electronica Sinica, 2022, 50(11): 2778-2789.
[7] 林耀进, 白盛兴, 赵红, 等. 基于标签关联性的分层分类共有与固有特征选择[J]. 软件学报, 2022, 33(7): 2667-2682.
LIN Y J, BAI S X, ZHAO H, et al. A label correlation based common and specific feature selection for hierarchical classification[J]. Journal of Software, 2022, 33(7): 2667-2682.
[8] 梁吉业, 钱宇华, 李德玉, 等. 大数据挖掘的粒计算理论与方法[J]. 中国科学: 信息科学, 2015, 45(11): 1355-1369.
LIANG J Y, QIAN Y H, LI D Y, et al. Theory and method of granular computing for big data mining[J]. Scientia Sinica: Informationis, 2015, 45(11): 1355-1369.
[9] 王国胤, 傅顺, 杨洁, 等. 基于多粒度认知的智能计算研究[J]. 计算机学报, 2022, 45(6): 1161-1175.
WANG G Y, FU S, YANG J, et al. A review of research on multi-granularity cognition based intelligent computing[J]. Chinese Journal of Computers, 2022, 45(6): 1161-1175.
[10] 王国胤, 于洪. 多粒度认知计算——一种大数据智能计算的新模型[J]. 数据与计算发展前沿, 2019, 1(6): 75-85.
WANG G Y, YU H. Multi-granularity cognitive computing—a new model for big data intelligent computing[J]. Frontiers of Data & Computing, 2019, 1(6): 75-85.
[11] MORSI N N, YAKOUT M M. Axiomatics for fuzzy rough sets[J]. Fuzzy Sets and Systems, 1998, 100: 327-342.
[12] DUBOIS D, PRADE H. Rough fuzzy sets and fuzzy rough sets[J]. International Journal of General Systems, 1990, 17(2/3): 191-209.
[13] JENSEN R, SHEN Q. Fuzzy-rough attribute reduction with application to web categorization[J]. Fuzzy Sets and Systems, 2004, 141(3): 469-485.
[14] WANG C, QI Y, SHAO M, et al. A fitting model for feature selection with fuzzy rough sets[J]. IEEE Transactions on Fuzzy Systems, 2016, 25(4): 741-753.
[15] ROSENBERG A, HIRSCHBERG J. V-measure: a conditional entropy-based external cluster evaluation measure[C]//Proceedings of the 2007 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning, Prague, Jun 28-30, 2007: 410-420.
[16] ZHANG X, MEI C, CHEN D, et al. Active incremental feature selection using a fuzzy-rough-set-based information entropy[J]. IEEE Transactions on Fuzzy Systems, 2019, 28(5): 901-915.
[17] JENSEN R, SHEN Q. New approaches to fuzzy-rough feature selection[J]. IEEE Transactions on Fuzzy Systems, 2008, 17(4): 824-838.
[18] CHEN D, ZHANG L, ZHAO S, et al. A novel algorithm for finding reducts with fuzzy rough sets[J]. IEEE Transactions on Fuzzy Systems, 2011, 20(2): 385-389.
[19] YANG Y, CHEN D, WANG H, et al. Incremental perspective for feature selection based on fuzzy rough sets[J]. IEEE Transactions on Fuzzy Systems, 2017, 26(3): 1257-1273.
[20] BHATT R B, GOPAL M. On fuzzy-rough sets approach to feature selection[J]. Pattern Recognition Letters, 2005, 26(7): 965-975.
[21] HU Q, YU D, XIE Z. Information-preserving hybrid data reduction based on fuzzy-rough techniques[J]. Pattern Recognition Letters, 2006, 27(5): 414-423.
[22] TSANG E C C, CHEN D, YEUNG D S, et al. Attributes reduction using fuzzy rough sets[J]. IEEE Transactions on Fuzzy Systems, 2008, 16(5): 1130-1141.
[23] ZHAO H, WANG P, HU Q, et al. Fuzzy rough set based feature selection for large-scale hierarchical classification[J]. IEEE Transactions on Fuzzy Systems, 2019, 27(10): 1891-1903.
[24] ZHAO H, HU Q, ZHU P, et al. A recursive regularization based feature selection framework for hierarchical classification[J]. IEEE Transactions on Knowledge and Data Engineering, 2019, 33(7): 2833-2846.
[25] TUO Q, ZHAO H, HU Q. Hierarchical feature selection with subtree based graph regularization[J]. Knowledge-Based Systems, 2019, 163: 996-1008.
[26] ZHENG J, LUO C, LI T, et al. A novel hierarchical feature selection method based on large margin nearest neighbor learning[J]. Neurocomputing, 2022, 497: 1-12.
[27] LIN Y, LIU H, ZHAO H, et al. Hierarchical feature selection based on label distribution learning[J]. IEEE Transactions on Knowledge and Data Engineering, 2023, 35(6): 5964-5976.
[28] CHEN H, LI T, RUAN D. Maintenance of approximations in incomplete ordered decision systems while attribute values coarsening or refining[J]. Knowledge-Based Systems, 2012, 31: 140-161.
[29] LUO C, LI T, CHEN H, et al. Fast algorithms for computing rough approximations in set-valued decision systems while updating criteria values[J]. Information Sciences, 2015, 299: 221-242.
[30] LUO C, LI T, CHEN H. Dynamic maintenance of approximations in set-valued ordered decision systems under the attribute generalization[J]. Information Sciences, 2014, 257: 210-228.
[31] YANG X, QI Y, YU H, et al. Updating multigranulation rough approximations with increasing of granular structures[J]. Knowledge-Based Systems, 2014, 64: 59-69.
[32] LIU D, LI T, ZHANG J. Incremental updating approximations in probabilistic rough sets under the variation of attributes[J]. Knowledge-Based Systems, 2015, 73: 81-96.
[33] FAN W, HE C, ZENG A, et al. An incremental approach based on hierarchical classification in multikernel fuzzy rough sets under the variation of object set[C]//Proceedings of the 18th International Conference on Intelligent Computing Methodologies, Xi’an, Aug 7-11, 2022. Cham: Springer International Publishing, 2022: 3-17.
[34] LUO C, LI T, CHEN H, et al. Incremental rough set approach for hierarchical multicriteria classification[J]. Information Sciences, 2018, 429: 72-87.
[35] ZADEH L A. Similarity relations and fuzzy orderings[J]. Information Sciences, 1971, 3(2): 177-200.
[36] CHEN D, HU Q, YANG Y. Parameterized attribute reduction with Gaussian kernel based fuzzy rough sets[J]. Information Sciences, 2011, 181(23): 5169-5179.
[37] KOSMOPOULOS A, PARTALAS I, GAUSSIER E, et al. Evaluation measures for hierarchical classification: a unified view and novel approaches[J]. Data Mining and Knowledge Discovery, 2015, 29: 820-865.
[38] AHO A V, HOPCROFT J E, ULLMAN J D. On finding lowest common ancestors in trees[C]//Proceedings of the 5th Annual ACM Symposium on Theory of Computing, Apr 30-May 2, 1973: 253-265.
[39] DEKEL O, KESHET J, SINGER Y. Large margin hierarchical classification[C]//Proceedings of the 21st International Conference on Machine Learning, Banff, Jul 4-8, 2004: 27.
[40] JOACHIMS T. Making large-scale SVM learning practical[R]. 1998.
[41] GUO G, WANG H, BELL D, et al. KNN model-based approach in classification[C]//On the Move to Meaningful Internet Systems 2003: CoopIS, DOA, and ODBASE-OTM Confederated International Conferences, Catania, Nov 3-7, 2003: 986-996.
[42] RIGATTI S J. Random forest[J]. Journal of Insurance Medicine, 2017, 47(1): 31-39.