[1] HARUTYUNYAN H, KHACHATRIAN H, KALE D C, et al. Multitask learning and benchmarking with clinical time series data[J]. Scientific Data, 2019, 6(1): 1-18.
[2] XU D, TIAN Y. A comprehensive survey of clustering algorithms[J]. Annals of Data Science, 2015, 2(2): 165-193.
[3] SUN J G, LIU J, ZHAO L Y. Clustering algorithms research[J]. Journal of Software, 2008, 19(1): 48-61.
孙吉贵, 刘杰, 赵连宇. 聚类算法研究[J]. 软件学报, 2008, 19(1): 48-61.
[4] CHEN Z W, CHANG D X. Automatic clustering algorithm based on density difference[J]. Journal of Software, 2018, 29(4): 935-944.
陈朝威, 常冬霞. 基于密度差分的自动聚类算法[J]. 软件学报, 2018, 29(4): 935-944.
[5] LE Q V. Building high-level features using large scale unsupervised learning[C]//Proceedings of the 2013 IEEE International Conference on Acoustics, Speech and Signal Processing, Vancouver, May 26-31, 2013. Piscataway: IEEE, 2013: 8595-8598.
[6] LÄNGKVIST M, KARLSSON L, LOUTFI A. A review of unsupervised feature learning and deep learning for time-series modeling[J]. Pattern Recognition Letters, 2014, 42: 11-24.
[7] HAN Z M, CHEN N, LE J J, et al. An efficient and effective clustering algorithm for time series of hot topics[J]. Chinese Journal of Computers, 2012, 35(11): 2337-2347.
韩忠明, 陈妮, 乐嘉锦, 等. 面向热点话题时间序列的有效聚类算法研究[J]. 计算机学报, 2012, 35(11): 2337-2347.
[8] WU X, ZHU X, WU G Q, et al. Data mining with big data[J]. IEEE Transactions on Knowledge and Data Engineering, 2013, 26(1): 97-107.
[9] SHU K, SLIVA A, WANG S, et al. Fake news detection on social media: a data mining perspective[J]. ACM SIGKDD Explorations Newsletter, 2017, 19(1): 22-36.
[10] EUÁN C, OMBAO H, ORTEGA J. The hierarchical spectral merger algorithm: a new time series clustering procedure[J]. Journal of Classification, 2018, 35(1): 71-99.
[11] MARQUES A G, SEGARRA S, LEUS G, et al. Stationary graph processes and spectral estimation[J]. IEEE Transactions on Signal Processing, 2017, 65(22): 5911-5926.
[12] RAHIMIAN H, BAYRAKSAN G, HOMEM-DE-MELLO T. Identifying effective scenarios in distributionally robust stochastic programs with total variation distance[J]. Mathematical Programming, 2019, 173(1/2): 393-430.
[13] AZENCOTT R, MURAVINA V, HEKMATI R, et al. Automatic clustering in large sets of time series[M]//CHETVERUSHKIN B N, FITZGIBBON W, KUZNETSOV Y A, et al. Contributions to Partial Differential Equations and Applications. Berlin, Heidelberg: Springer, 2019: 65-75.
[14] AMIRI M M, GÜNDÜZ D. Machine learning at the wireless edge: distributed stochastic gradient descent over-the-air[J]. IEEE Transactions on Signal Processing, 2020, 68: 2155-2169.
[15] MANDT S, HOFFMAN M D, BLEI D M. Stochastic gradient descent as approximate Bayesian inference[J]. Journal of Machine Learning Research, 2017, 18(1): 4873-4907.
[16] ZHENG J W, LI Z R, WANG W L, et al. Clustering with joint Laplacian regularization and adaptive feature learning[J]. Journal of Software, 2019, 30(12): 3846-3861.
郑建炜, 李卓蓉, 王万良, 等. 联合Laplacian正则项和特征自适应的数据聚类算法[J]. 软件学报, 2019, 30(12): 3846-3861.
[17] TANG C, ZHU X, LIU X, et al. Learning a joint affinity graph for multiview subspace clustering[J]. IEEE Transactions on Multimedia, 2018, 21(7): 1724-1736.
[18] ZHANG D Y, ZHOU L H, WU X Y, et al. Data stream clustering based on grid coupling[J]. Journal of Software, 2019, 30(3): 667-683.
张东月, 周丽华, 吴湘云, 等. 基于网格耦合的数据流聚类[J]. 软件学报, 2019, 30(3): 667-683.
[19] ZAKARIA J, MUEEN A, KEOGH E. Clustering time series using unsupervised-shapelets[C]//Proceedings of the 12th IEEE International Conference on Data Mining, Brussels, Dec 10-13, 2012. Washington: IEEE Computer Society, 2012: 785-794.
[20] MADIRAJU N S, SADAT S M, FISHER D, et al. Deep temporal clustering: fully unsupervised learning of time-domain features[J]. arXiv:1802.01059, 2018.
[21] MALININ A, GALES M. Reverse KL-divergence training of prior networks: improved uncertainty and adversarial robustness[C]//Proceedings of the Annual Conference on Neural Information Processing Systems, Vancouver, Dec 8-14, 2019. Red Hook: Curran Associates, 2019: 14520-14531.
[22] ZHU Y M, WAN J C, ZHOU Z M, et al. Triple-to-Text: converting RDF triples into high-quality natural languages via optimizing an inverse KL divergence[C]//Proceedings of the 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval, Paris, Jul 21-25, 2019. New York: ACM, 2019: 455-464.