Estimating the Support of a High-Dimensional Distribution

1、Generally, they can be characterized as estimating functions of the data, which reveal something interesting about the underlying distributions. For instance, kernel principal component analysis (PCA) can be characterized as computing functions that on the training data produce unit variance outputs while having minimum norm in feature space (Schölkopf, Smola, & Müller, 1999). Another kernel-based unsupervised learning technique, regularized principal manifolds (Smola, Mika, Schölkopf, & Williamson, in press), computes functions that give a mapping onto a lower-dimensional manifold minimizing a regularized quantization error. Clustering algorithms are further examples of unsupervised learning techniques that can be kernelized (Schölkopf, Smola, & Müller, 1999).

Kernel methods can be characterized as estimating functions of the data that reveal something about the underlying distribution. For example, kernel PCA can be characterized as computing functions that, on the training data, produce unit-variance outputs while having minimum norm in feature space. Another kernel-based unsupervised learning method, regularized principal manifolds, computes functions that map the original data onto a lower-dimensional manifold by minimizing a regularized quantization error. Clustering is a further unsupervised learning method that can be kernelized.
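The kernel PCA characterization above (unit-variance outputs on the training data) can be sketched as follows. This is a minimal numpy illustration, not the paper's implementation; the function names, the RBF kernel choice, and `gamma` are my own assumptions:

```python
import numpy as np

def rbf_gram(X, gamma=1.0):
    # Gram matrix of the Gaussian kernel k(x, y) = exp(-gamma * ||x - y||^2)
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def kernel_pca(K, n_components=2):
    """Center the Gram matrix in feature space, eigendecompose it, and
    scale the projections so each output component has unit sample variance,
    matching the 'unit variance outputs' characterization quoted above."""
    n = K.shape[0]
    J = np.full((n, n), 1.0 / n)
    Kc = K - J @ K - K @ J + J @ K @ J     # double centering
    w, V = np.linalg.eigh(Kc)              # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:n_components]
    # eigenvectors are unit-norm with zero mean, so sqrt(n)*v has variance 1
    Z = V[:, idx] * np.sqrt(n)
    return Z, w[idx]
```

Calling `kernel_pca(rbf_gram(X), 2)` on a training set `X` returns the first two nonlinear components of every training point, each normalized to unit variance.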

2、Section 2: Previous Work

Placeholder (limited background knowledge; not yet understood).

3、OCSVM

Placeholder: SMO optimization and its improvements.
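As a starting point for the SMO notes, here is a minimal sketch of an SMO-style solver for the one-class SVM dual, min ½ αᵀKα subject to 0 ≤ αᵢ ≤ 1/(νn) and Σαᵢ = 1. It updates one random pair of coefficients at a time, which keeps the equality constraint satisfied. All names (`ocsvm_smo`, `rbf_kernel`), the random pair-selection heuristic, and the stopping rule are illustrative assumptions, not the paper's actual algorithm:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    # Gram matrix of the Gaussian kernel k(x, y) = exp(-gamma * ||x - y||^2)
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def ocsvm_smo(K, nu=0.2, n_iter=20000, seed=0):
    """Pairwise (SMO-style) coordinate updates for the one-class SVM dual."""
    n = K.shape[0]
    C = 1.0 / (nu * n)
    a = np.full(n, 1.0 / n)        # feasible start: sum(a) = 1, 0 <= a_i <= C
    g = K @ a                      # gradient of 1/2 a^T K a
    rng = np.random.default_rng(seed)
    for _ in range(n_iter):
        i, j = rng.choice(n, size=2, replace=False)
        eta = K[i, i] - 2 * K[i, j] + K[j, j]   # curvature along the pair
        if eta <= 1e-12:
            continue
        s = a[i] + a[j]                          # preserved by the update
        # Newton step on a_i with a_j = s - a_i, then clip to the box
        ai = np.clip(a[i] - (g[i] - g[j]) / eta,
                     max(0.0, s - C), min(C, s))
        da_i, da_j = ai - a[i], (s - ai) - a[j]
        if abs(da_i) < 1e-14:
            continue
        a[i] += da_i
        a[j] += da_j
        g += K[:, i] * da_i + K[:, j] * da_j     # incremental gradient update
    # recover the offset rho from margin support vectors (0 < a_i < C)
    sv = (a > 1e-8) & (a < C - 1e-8)
    rho = g[sv].mean() if sv.any() else g[a > 1e-8].min()
    return a, rho
```

Given training points `X`, fitting and scoring them looks like `a, rho = ocsvm_smo(rbf_kernel(X, X), nu=0.2)`; a point with `(K @ a - rho) < 0` is flagged as an outlier, and by the ν-property roughly a fraction ν of the training set ends up outside the estimated support.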
