Affiliation: [1] Department of Communication and Electronic Engineering, Yanshan University, Qinhuangdao 066004, China
Source: Journal of Signal Processing (《信号处理》), 2007, No. 2, pp. 161-164 (4 pages)
Funding: Supported by the National Natural Science Foundation of China (60272073)
Abstract: A support vector machine (SVM) constructs a maximum-margin separating hyperplane from the few vectors lying near the class boundary. When massive training samples from different classes are heavily intermixed, the number of support vectors grows sharply and training becomes much more difficult. To address this problem, this paper combines structural-risk-minimization nearest-neighbor analysis with the support vector machine to form a new SVM learning method. First, a training subset is selected by structural-risk-minimization nearest-neighbor analysis according to each training sample's nearest-neighbor distance to the opposite class. Within the selected sample subspace, the Lagrange multipliers are then obtained directly by a multiplicative update rule rather than by the traditional quadratic programming. Finally, the remaining samples are added for cross-validation until the algorithm satisfies the convergence criterion. Classification experiments show that the proposed algorithm performs well and overcomes a weakness of the standard SVM: in particular, when the training set is large and the number of support vectors is high, it substantially reduces the computational complexity and increases classification speed.
Keywords: structural risk minimization principle; support vector machine; kernel function; multiplicative update rule; nearest neighbor classifier
Classification: TP181 [Automation and Computer Technology — Control Theory and Control Engineering]
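The abstract describes two computational steps: selecting a boundary-oriented training subset from each sample's nearest-neighbor distance to the opposite class, and solving for the SVM Lagrange multipliers with a multiplicative update rule instead of quadratic programming. The Python sketch below illustrates those two ideas under simplifying assumptions that are not taken from the paper: an RBF kernel, a bias-free dual without the equality constraint, the box constraint handled by clipping, a distance-quantile threshold for subset selection, and the Sha–Saul–Lee multiplicative update standing in for the authors' rule. All function names and parameters are hypothetical.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between rows of X and rows of Y (assumed kernel choice)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def select_boundary_subset(X, y, quantile=0.5):
    """Step 1 (illustrative): keep samples whose distance to the nearest opposite-class
    sample is small, i.e. samples near the class boundary that are likely support vectors.
    The quantile threshold is an assumption, not the paper's selection criterion."""
    d_opp = np.empty(len(X))
    for i in range(len(X)):
        other = X[y != y[i]]
        d_opp[i] = np.min(np.linalg.norm(other - X[i], axis=1))
    keep = d_opp <= np.quantile(d_opp, quantile)
    return X[keep], y[keep]

def svm_multiplicative_update(K, y, C=10.0, iters=500, eps=1e-12):
    """Step 2 (illustrative): solve the bias-free SVM dual
        max  e'a - 0.5 a'Aa,   0 <= a <= C,   A = diag(y) K diag(y),
    with the multiplicative update of Sha, Saul and Lee:
        a_i <- a_i * (1 + sqrt(1 + 4 (A+ a)_i (A- a)_i)) / (2 (A+ a)_i),
    where A+ / A- are the positive / negative parts of A. The equality constraint is
    dropped and the upper bound C is enforced by clipping (simplifications)."""
    A = (y[:, None] * y[None, :]) * K
    Ap, Am = np.maximum(A, 0.0), np.maximum(-A, 0.0)
    a = np.full(len(y), 0.1)
    for _ in range(iters):
        num = 1.0 + np.sqrt(1.0 + 4.0 * (Ap @ a) * (Am @ a))
        a = np.clip(a * num / (2.0 * (Ap @ a) + eps), 0.0, C)
    return a

# Toy usage: two overlapping Gaussian classes.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (200, 2)), rng.normal(1, 1, (200, 2))])
y = np.hstack([-np.ones(200), np.ones(200)])
Xs, ys = select_boundary_subset(X, y)                       # reduced training subset
alpha = svm_multiplicative_update(rbf_kernel(Xs, Xs), ys)   # Lagrange multipliers
decision = (alpha * ys) @ rbf_kernel(Xs, X)                 # bias-free decision values
print("subset size:", len(Xs), " training accuracy:", np.mean(np.sign(decision) == y))
```

In this sketch the cost of the multiplicative updates scales with the size of the selected subset rather than the full training set, which is the mechanism the abstract credits for the reduced computational load; the paper's additional step of cross-validating with the leftover samples until convergence is not reproduced here.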