[1] CEKIK R, UYSAL A K. A novel filter feature selection method using rough set for short text data[J]. Expert Systems with Applications, 2020, 160: 113691.
[2] CHEN C W, TSAI Y H, CHANG F R, et al. Ensemble feature selection in medical datasets: combining filter, wrapper, and embedded feature selection results[J]. Expert Systems, 2020, 37(5): e12553.
[3] DROTÁR P, GAZDA M, VOKOROKOS L. Ensemble feature selection using election methods and ranker clustering[J]. Information Sciences, 2019, 480: 365-380.
[4] 杨春, 郭健, 张磊, 等. 采用卡方检验的模糊自适应无迹卡尔曼滤波组合导航算法[J]. 控制与决策, 2018, 33(1): 81-87.
YANG C, GUO J, ZHANG L, et al. Fuzzy adaptive unscented Kalman filter integrated navigation algorithm using Chi-square test[J]. Control and Decision, 2018, 33(1): 81-87.
[5] 叶明全, 高凌云, 万春圆, 等. 基于对称不确定性和SVM递归特征消除的信息基因选择方法[J]. 模式识别与人工智能, 2017, 30(5): 429-438.
YE M Q, GAO L Y, WAN C Y, et al. Informative gene selection method based on symmetric uncertainty and SVM recursive feature elimination[J]. Pattern Recognition and Artificial Intelligence, 2017, 30(5): 429-438.
[6] MEINSHAUSEN N, BÜHLMANN P. Stability selection[J]. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 2010, 72(4): 417-473.
[7] GONZÁLEZ S, GARCÍA S, DEL SER J, et al. A practical tutorial on bagging and boosting based ensembles for machine learning: algorithms, software tools, performance study, practical perspectives and opportunities[J]. Information Fusion, 2020, 64: 205-237.
[8] 梁令羽, 孙铭堃, 何为, 等. Bagging-SVM集成分类器估计头部姿态方法[J]. 计算机科学与探索, 2019, 13(11): 1935-1944.
LIANG L Y, SUN M K, HE W, et al. Head pose estimation method of bagging-SVM integrated classifier[J]. Journal of Frontiers of Computer Science and Technology, 2019, 13(11): 1935-1944.
[9] PALEOLOGO G, ELISSEEFF A, ANTONINI G. Subagging for credit scoring models[J]. European Journal of Operational Research, 2010, 201(2): 490-499.
[10] SHAH R D, SAMWORTH R J. Variable selection with error control: another look at stability selection[J]. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 2013, 75(1): 55-80.
[11] YANG X, WANG Y, WANG R, et al. Ensemble feature selection with block-regularized m×2 cross-validation[J]. IEEE Transactions on Neural Networks and Learning Systems, 2021, 34(9): 6628-6641.
[12] BACH F R. Bolasso: model consistent lasso estimation through the bootstrap[C]//Proceedings of the 25th International Conference on Machine Learning, Helsinki, Jun 5-9, 2008: 33-40.
[13] ELGHAZEL H, AUSSEM A. Unsupervised feature selection with ensemble learning[J]. Machine Learning, 2015, 98(1/2): 157-180.
[14] HALLAJIAN B, MOTAMENI H, AKBARI E. Ensemble feature selection using distance-based supervised and unsupervised methods in binary classification[J]. Expert Systems with Applications, 2022, 200: 116794.
[15] LIU K, YANG X, YU H, et al. Rough set based semi-supervised feature selection via ensemble selector[J]. Knowledge-Based Systems, 2019, 165: 282-296.
[16] KWOK T Y, YEUNG D Y. Use of bias term in projection pursuit learning improves approximation and convergence properties[J]. IEEE Transactions on Neural Networks, 1996, 7(5): 1168-1183.
[17] RAWAT M, GHANNOUCHI F M, RAWAT K. Three-layered biased memory polynomial for dynamic modeling and predistortion of transmitters with memory[J]. IEEE Transactions on Circuits and Systems I: Regular Papers, 2012, 60(3): 768-777.
[18] SUYKENS J A K, VANDEWALLE J. Least squares support vector machine classifiers[J]. Neural Processing Letters, 1999, 9(3): 293-300.
[19] BERRAR D. Cross-validation[M]//Encyclopedia of Bioinformatics and Computational Biology. Amsterdam: Elsevier, 2019: 542-545.
[20] WANG R, WANG Y, LI J, et al. Block-regularized m×2 cross-validated estimator of the generalization error[J]. Neural Computation, 2017, 29(2): 519-554.
[21] 李赟波, 王士同. 多源域分布下优化权重的迁移学习Boosting方法[J]. 计算机科学与探索, 2023, 17(6): 1441-1452.
LI Y B, WANG S T. Transfer learning Boosting for weight optimization under multi-source domain distribution[J]. Journal of Frontiers of Computer Science and Technology, 2023, 17(6): 1441-1452.
[22] 徐光生, 王士同. 基于潜在的低秩约束的不完整模态迁移学习[J]. 计算机科学与探索, 2022, 16(12): 2775-2787.
XU G S, WANG S T. Incomplete modality transfer learning via latent low-rank constraint[J]. Journal of Frontiers of Computer Science and Technology, 2022, 16(12): 2775-2787.
[23] CHO D, YOO C, IM J, et al. Comparative assessment of various machine learning-based bias correction methods for numerical weather prediction model forecasts of extreme air temperatures in urban areas[J]. Earth and Space Science, 2020, 7(4): e2019EA000740.
[24] LIANG X, ZOU T, GUO B, et al. Assessing Beijing's PM2.5 pollution: severity, weather impact, APEC and winter heating[J]. Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences, 2015, 471(2182): 20150257.