Support vector pre-extraction algorithm based on compressed K-nearest neighbor boundary vector


Authors: WANG Shi; JIANG Ning-ning; YANG Shu-hui

Affiliations: [1] Scientific Research and Academic Development Office, Naval University of Engineering, Wuhan 430033, China; [2] Network Office, Navy Command Automation Workstation, Beijing 100841, China; [3] Teaching and Research Support Center, Naval University of Engineering, Wuhan 430033, China

Source: Journal of Naval University of Engineering, 2018, No. 6, pp. 74-79 (6 pages)

Abstract: Support vector machines consume excessive training time and memory when processing large data sets. To address this problem, a support vector pre-extraction algorithm (CKNN) based on compressed K-nearest neighbor boundary vectors is proposed and evaluated by simulation. The results show that, for both a linearly separable data set and a highly nonlinear spiral-curve data set, CKNN greatly reduces the number of training samples with only a small loss of SVM generalization performance. A further simulation on five UCI standard data sets confirms the effectiveness of CKNN; the gain is larger for large data volumes, at the cost of a slight loss in SVM generalization performance.
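The abstract only outlines the idea, so the following is a minimal illustrative sketch of the pre-extraction step it describes: keep as support-vector candidates those samples whose K-nearest neighborhood contains points of another class, then train the SVM only on that reduced set. The helper name knn_boundary_vectors and the synthetic data are assumptions for illustration; the paper's actual compression (condensation) step and parameter choices are not reproduced here.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.svm import SVC

def knn_boundary_vectors(X, y, k=10):
    """Rough proxy for CKNN's boundary-vector selection (not the paper's exact method):
    a sample is kept if any of its k nearest neighbors belongs to another class."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    _, idx = nn.kneighbors(X)            # idx[:, 0] is the query point itself
    neighbor_labels = y[idx[:, 1:]]      # labels of the k true neighbors
    return (neighbor_labels != y[:, None]).any(axis=1)

# Usage on synthetic, linearly separable data: train the SVM only on the
# pre-extracted boundary vectors instead of the full training set.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

mask = knn_boundary_vectors(X, y, k=10)
svm = SVC(kernel="rbf").fit(X[mask], y[mask])
print(f"kept {mask.sum()} of {len(X)} samples for SVM training")
```

Since SVM decision boundaries depend only on samples near the class margin, discarding interior samples in this way shrinks the quadratic-programming problem while, as the abstract reports, costing only a small amount of generalization performance.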

Keywords: support vector machine; boundary vector; K-nearest neighbor; compressed nearest neighbor

Classification code: TP391.45 [Automation and Computer Technology - Computer Application Technology]

 
