Journal of Frontiers of Computer Science and Technology ›› 2018, Vol. 12 ›› Issue (5): 785-793.DOI: 10.3778/j.issn.1673-9418.1705027
FENG Chang, LIAO Shizhong+
Abstract: Large-scale kernel methods are fundamental machine learning methods for big data analysis and mining. Kernel methods search reproducing kernel Hilbert spaces (RKHSs) for hypotheses that solve nonlinear learning problems in the original sample space. However, training requires time quadratic in the training set size, and prediction depends on the entire training set, so kernel methods are impractical for large-scale datasets. To address these issues, this paper proposes an efficient random-hypothesis-space approach to large-scale kernel methods. First, a circulant random feature mapping is used to explicitly construct random hypothesis spaces, called circulant random hypothesis spaces (CRHSs), in time log-linear in the input dimensionality. Then, off-the-shelf linear algorithms are applied in a CRHS to train a linear model, with time complexity linear or sublinear in the training set size. Theoretically, a uniform generalization error bound over the CRHS is derived, which converges with high probability to the generalization error of the optimal hypothesis in the RKHS. Experimental results on benchmark datasets demonstrate that the proposed approach significantly improves the efficiency of nonlinear kernel methods while maintaining comparable prediction accuracy. The proposed approach is theoretically guaranteed and computationally efficient, providing a state-of-the-art realization of large-scale kernel methods.
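The abstract's pipeline can be sketched in two steps: map inputs into an explicit random feature space using a circulant matrix (so the projection costs O(d log d) via the FFT rather than O(d^2) for a dense Gaussian matrix), then fit any linear learner on the mapped data. The sketch below is an illustrative NumPy construction in the spirit of circulant random Fourier features, not the paper's exact algorithm; the function name, the sign-flip diagonal, and the choice D <= d are assumptions for the example.

```python
import numpy as np

def circulant_random_features(X, D, rng):
    """Map rows of X (shape n x d) to a D-dimensional random feature space.

    Illustrative sketch: multiplication by a circulant matrix C (first
    column w) is computed as ifft(fft(w) * fft(x)), costing O(d log d)
    per sample. Random sign flips s decorrelate the circulant rows; the
    cosine with random phase b gives random Fourier features. Assumes
    D <= d for simplicity.
    """
    n, d = X.shape
    w = rng.standard_normal(d)            # first column of the circulant matrix
    s = rng.choice([-1.0, 1.0], size=d)   # random sign-flip diagonal
    b = rng.uniform(0.0, 2.0 * np.pi, size=d)  # random phases
    # Circulant multiply for every row at once via the FFT.
    proj = np.real(np.fft.ifft(np.fft.fft(w) * np.fft.fft(X * s, axis=1), axis=1))
    Z = np.sqrt(2.0 / d) * np.cos(proj + b)    # explicit random feature map
    return Z[:, :D]

# After the mapping, any off-the-shelf linear learner (e.g. linear SVM or
# ridge regression on Z) plays the role of training in the random
# hypothesis space, with cost linear in the number of samples.
```

Because the feature map is explicit, prediction needs only the learned linear weights and one O(d log d) mapping per test point, rather than kernel evaluations against the whole training set.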
Key words: kernel methods, circulant random feature mapping, random hypothesis space, linear learning algorithm, large-scale kernel methods
FENG Chang, LIAO Shizhong. Large-Scale Kernel Methods via Random Hypothesis Spaces[J]. Journal of Frontiers of Computer Science and Technology, 2018, 12(5): 785-793.
URL: http://fcst.ceaj.org/EN/10.3778/j.issn.1673-9418.1705027
http://fcst.ceaj.org/EN/Y2018/V12/I5/785