Kernel-Based Discriminative Stochastic Neighbor Embedding Analysis (Cited by: 5)

Authors: Wang Wanliang [1], Qiu Hong [1], Huang Qiongfang [1], Zheng Jianwei [1]

Affiliation: [1] College of Computer Science and Technology, Zhejiang University of Technology, Hangzhou 310023, China

Source: Journal of Computer-Aided Design & Computer Graphics, 2014, No. 4, pp. 623-631 (9 pages)

Funding: National Key Technology R&D Program of China for the 12th Five-Year Plan (2012BAD10B01); National Natural Science Foundation of China (61070043); Zhejiang Provincial Natural Science Foundation (LQ12F03011)

Abstract: To improve discriminative efficiency and solve the out-of-sample problem in nonlinear feature extraction, while preserving the observed information as fully as possible and further improving the dimensionality-reduction performance of related methods, kernel learning is applied to discriminative stochastic neighbor embedding (DSNE), yielding a kernel-based discriminative stochastic neighbor embedding (KDSNE) method. By introducing a kernel function, samples in the original space are mapped into a high-dimensional kernel space, where a joint-probability expression is constructed to model the pairwise similarities between same-class and different-class samples. A linear projection matrix is then introduced to generate the corresponding subspace representations. Finally, an objective functional is built under the criterion of minimizing the intra-class Kullback-Leibler (KL) divergence while maximizing the inter-class KL divergence. The method accentuates the feature differences between samples of different classes and makes the samples linearly separable, thereby improving classification performance. Experiments on the COIL-20 image database and the classic ORL and Yale face databases validate the discriminative ability of the proposed method.
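The recipe in the abstract (a kernel mapping, SNE-style joint probabilities split by class, a linear projection applied in kernel space, and an intra-/inter-class KL criterion) can be sketched in a few dozen lines of NumPy. The snippet below is one illustrative reading of that recipe, not the authors' implementation: the RBF kernel, the bandwidth and ridge values, the Gaussian similarity model, and the use of SciPy's L-BFGS-B with numerical gradients are all assumptions made for brevity.

```python
# Illustrative KDSNE-style objective (a sketch, not the paper's code).
# Assumptions: RBF kernel, Gaussian similarities, a small ridge term,
# and SciPy's L-BFGS-B with numerical gradients.
import numpy as np
from scipy.optimize import minimize

def rbf_kernel(X, gamma=0.1):
    """K[i, j] = exp(-gamma * ||x_i - x_j||^2): the implicit feature map."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.clip(d2, 0.0, None))

def joint_probabilities(D2, mask):
    """Symmetric SNE-style joint distribution over the pairs kept by `mask`
    (same-class or different-class), built from squared distances D2."""
    W = np.exp(-D2) * mask
    np.fill_diagonal(W, 0.0)
    return W / max(W.sum(), 1e-12)

def kdsne_objective(a_flat, K, labels, dim=2):
    """KL(P_w || Q_w) - KL(P_b || Q_b): minimizing this pulls same-class
    pairs together and pushes different-class pairs apart."""
    n = K.shape[0]
    A = a_flat.reshape(n, dim)
    Y = K @ A  # linear projection applied in kernel space
    # Squared feature-space distances via the kernel trick:
    # ||phi(x_i) - phi(x_j)||^2 = K_ii + K_jj - 2 K_ij.
    kd = np.diag(K)
    D2_hi = kd[:, None] + kd[None, :] - 2.0 * K
    sq = np.sum(Y ** 2, axis=1)
    D2_lo = np.clip(sq[:, None] + sq[None, :] - 2.0 * Y @ Y.T, 0.0, None)
    same = (labels[:, None] == labels[None, :]).astype(float)
    eps = 1e-12
    loss = 1e-3 * np.sum(A ** 2)  # small ridge (an assumption) keeps it well-posed
    for mask, sign in ((same, +1.0), (1.0 - same, -1.0)):
        P = joint_probabilities(D2_hi, mask)
        Q = joint_probabilities(D2_lo, mask)
        loss += sign * np.sum(P * np.log((P + eps) / (Q + eps)))
    return loss

# Toy usage: two noisy 5-D classes embedded into 2-D.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (20, 5)), rng.normal(3.0, 1.0, (20, 5))])
labels = np.repeat([0, 1], 20)
K = rbf_kernel(X)
a0 = rng.normal(scale=1e-2, size=K.shape[0] * 2)
res = minimize(kdsne_objective, a0, args=(K, labels), method="L-BFGS-B")
Y = K @ res.x.reshape(-1, 2)  # final low-dimensional embedding
print("objective at optimum:", res.fun)
```

Writing the embedding as Y = K A is the standard kernel trick for linear maps and is one way to read the abstract's out-of-sample claim: a new sample x can be embedded through its kernel row [k(x, x_1), ..., k(x, x_n)] A without re-running the optimization.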

Keywords: discriminative stochastic neighbor embedding; kernel-based methods; data visualization; nonlinear feature extraction

CLC Number: TP301.6 [Automation & Computer Technology — Computer System Architecture]

 
