[1] Brzezinski D, Stefanowski J. Reacting to different types of concept drift: the accuracy updated ensemble algorithm[J]. IEEE Transactions on Neural Networks and Learning Systems, 2014, 25(1): 81-94.
[2] Krawczyk B, Minku L L, Gama J, et al. Ensemble learning for data stream analysis: a survey[J]. Information Fusion, 2017, 37: 132-156.
[3] Gama J, Žliobaitė I, Bifet A, et al. A survey on concept drift adaptation[J]. ACM Computing Surveys, 2014, 46(4): 44.
[4] Kuncheva L I. Classifier ensembles for changing environments[C]//LNCS 3077: Proceedings of the 5th International Workshop on Multiple Classifier Systems, Cagliari, Jun 9-11, 2004. Berlin, Heidelberg: Springer, 2004: 1-15.
[5] Grossberg S. Nonlinear neural networks: principles, mechanisms, and architectures[J]. Neural Networks, 1988, 1(1): 17-61.
[6] Street W N, Kim Y S. A streaming ensemble algorithm (SEA) for large-scale classification[C]//Proceedings of the 7th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Francisco, Aug 26-29, 2001. New York: ACM, 2001: 377-382.
[7] Wang H X, Fan W, Yu P S, et al. Mining concept-drifting data streams using ensemble classifiers[C]//Proceedings of the 9th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, Washington, Aug 24-27, 2003. New York: ACM, 2003: 226-235.
[8] Brzezinski D, Stefanowski J. Combining block-based and online methods in learning ensembles from concept drifting data streams[J]. Information Sciences, 2014, 265: 50-67.
[9] Minku L L, Yao X. DDD: a new ensemble approach for dealing with concept drift[J]. IEEE Transactions on Knowledge and Data Engineering, 2011, 24(4): 619-633.
[10] Khamassi I, Sayed-Mouchaweh M, Hammami M, et al. Ensemble classifiers for drift detection and monitoring in dynamical environments[C]//Proceedings of the 2013 Annual Conference of the Prognostics and Health Management Society, New Orleans, Oct 14-17, 2013: 1-14.
[11] Ren Y, Zhang L, Suganthan P N. Ensemble classification and regression-recent developments, applications and future directions[J]. IEEE Computational Intelligence Magazine, 2016, 11(1): 41-53.
[12] Freund Y, Schapire R E. A decision-theoretic generalization of on-line learning and an application to Boosting[J]. Journal of Computer and System Sciences, 1997, 55(1): 119-139.
[13] Schlimmer J C, Granger R H. Beyond incremental processing: tracking concept drift[C]//Proceedings of the 5th National Conference on Artificial Intelligence, Philadelphia, Aug 11-15, 1986. San Mateo: Morgan Kaufmann, 1986: 502-507.
[14] Khamassi I, Sayed-Mouchaweh M, Hammami M, et al. Discussion and review on evolving data streams and concept drift adapting[J]. Evolving Systems, 2018, 9(1): 1-23.
[15] Ditzler G, Roveri M, Alippi C, et al. Learning in nonstationary environments: a survey[J]. IEEE Computational Intelligence Magazine, 2015, 10(4): 12-25.
[16] Gomes H M, Barddal J P, Enembreck F, et al. A survey on ensemble learning for data stream classification[J]. ACM Computing Surveys, 2017, 50(2): 23.
[17] Minku L L, White A P, Yao X. The impact of diversity on online ensemble learning in the presence of concept drift[J]. IEEE Transactions on Knowledge and Data Engineering, 2009, 22(5): 730-742.
[18] Gama J, Medas P, Castillo G, et al. Learning with drift detection[C]//LNCS 3171: Proceedings of the 17th Brazilian Symposium on Artificial Intelligence, São Luis, Sep 29-Oct 1, 2004. Berlin, Heidelberg: Springer, 2004: 286-295.
[19] Baena-García M, del Campo-Ávila J, Fidalgo R, et al. Early drift detection method[C]//Proceedings of the 4th International Workshop on Knowledge Discovery from Data Streams, 2006: 77-86.
[20] Nishida K, Yamauchi K. Detecting concept drift using statistical testing[C]//LNCS 4755: Proceedings of the 10th International Conference on Discovery Science, Sendai, Oct 1-4, 2007. Berlin, Heidelberg: Springer, 2007: 264-269.
[21] Bifet A, Gavaldà R. Learning from time-changing data with adaptive windowing[C]//Proceedings of the 7th SIAM International Conference on Data Mining, Minneapolis, Apr 26-28, 2007. Philadelphia: SIAM, 2007: 443-448.
[22] Du L, Song Q B, Jia X L. Detecting concept drift: an information entropy based method using an adaptive sliding window[J]. Intelligent Data Analysis, 2014, 18(3): 337-364.
[23] Domingos P M, Hulten G. Mining high-speed data streams[C]//Proceedings of the 6th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, Boston, Aug 20-23, 2000. New York: ACM, 2000: 71-80.
[24] Hulten G, Spencer L, Domingos P M. Mining time-changing data streams[C]//Proceedings of the 7th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Francisco, Aug 26-29, 2001. New York: ACM, 2001: 97-106.
[25] Gomes H M, Bifet A, Read J, et al. Adaptive random forests for evolving data stream classification[J]. Machine Learning, 2017, 106(9/10): 1469-1495.
[26] Oza N C. Online Bagging and Boosting[C]//Proceedings of the 2005 IEEE International Conference on Systems, Man and Cybernetics, Waikoloa, Oct 10-12, 2005. Piscataway: IEEE, 2005: 2340-2345.
[27] Bifet A, Holmes G, Pfahringer B, et al. New ensemble methods for evolving data streams[C]//Proceedings of the 15th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, Paris, Jun 28-Jul 1, 2009. New York: ACM, 2009: 139-148.
[28] Pocock A C, Yiapanis P, Singer J, et al. Online non-stationary Boosting[C]//LNCS 5997: Proceedings of the 9th International Workshop on Multiple Classifier Systems, Cairo, Apr 7-9, 2010. Berlin, Heidelberg: Springer, 2010: 205-214.
[29] Kolter J Z, Maloof M A. Dynamic weighted majority: an ensemble method for drifting concepts[J]. Journal of Machine Learning Research, 2007, 8(11): 2755-2790.
[30] Littlestone N, Warmuth M K. The weighted majority algorithm[J]. Information and Computation, 1994, 108(2): 212-261.
[31] Brzezinski D, Stefanowski J. Accuracy updated ensemble for data streams with concept drift[C]//LNCS 6679: Proceedings of the 6th International Conference on Hybrid Artificial Intelligent Systems, Wroclaw, May 23-25, 2011. Berlin, Heidelberg: Springer, 2011: 155-163.
[32] Polikar R, Upda L, Upda S S, et al. Learn++: an incremental learning algorithm for supervised neural networks[J]. IEEE Transactions on Systems, Man, and Cybernetics: Part C, 2001, 31(4): 497-508.
[33] Elwell R, Polikar R. Incremental learning of concept drift in nonstationary environments[J]. IEEE Transactions on Neural Networks, 2011, 22(10): 1517-1531.
[34] Scholz M, Klinkenberg R. An ensemble classifier for drifting concepts[C]//Proceedings of the 2nd International Workshop on Knowledge Discovery in Data Streams, 2005: 53-64.
[35] Scholz M. Knowledge-based sampling for subgroup discovery[C]//LNCS 3539: International Seminar: Local Pattern Detection, Dagstuhl Castle, Apr 12-16, 2004. Berlin, Heidelberg: Springer, 2005: 171-189.
[36] He H B, Chen S, Li K, et al. Incremental learning from stream data[J]. IEEE Transactions on Neural Networks, 2011, 22(12): 1901-1914.
[37] Chu F, Zaniolo C. Fast and light Boosting for adaptive mining of data streams[C]//LNCS 3056: Proceedings of the 8th Pacific-Asia Conference on Advances in Knowledge Discovery and Data Mining, Sydney, May 26-28, 2004. Berlin, Heidelberg: Springer, 2004: 282-292.
[38] Bifet A, Holmes G, Kirkby R, et al. MOA: massive online analysis[J]. Journal of Machine Learning Research, 2010, 11: 1601-1604.
[39] Breiman L, Friedman J H, Olshen R A, et al. Classification and regression trees[M]. Monterey: Wadsworth & Brooks, 1984.