Affiliation: [1] State Key Laboratory of Industrial Control Technology, Zhejiang University, Hangzhou 310027, Zhejiang, China
Source: Spectroscopy and Spectral Analysis (《光谱学与光谱分析》), 2008, No. 12, pp. 2847-2850 (4 pages)
Funding: Supported by the National "863" Program of China (2006AA04Z169) and the Science and Technology Plan of Zhejiang Province (2005C311042)
Abstract: Near-infrared (NIR) spectroscopy is a rapid analytical technique that has developed quickly in recent years. To improve the accuracy of NIR quantitative analysis, this paper first classifies a test sample with a support vector machine (SVM) classifier and then builds the calibration model from the calibration-set samples of the same class, whose properties are similar to those of the test sample, to predict the property of interest. To overcome the influence of misclassified samples, a new hybrid PLS algorithm (H_PLS) is proposed. The algorithm combines a local PLS model built on the same-class training samples (C_PLS) with a local PLS model built on the whole training set (D_PLS), and computes the predicted property value of the test sample by comparing the outputs of the two models. Experimental results on a set of gasoline samples show that C_PLS predicts more accurately than D_PLS when the classification is entirely correct, but its accuracy drops markedly when classification errors occur. H_PLS combines the advantages of C_PLS and D_PLS: even in the presence of misclassified samples, it raises the multiple correlation coefficient from 0.9734 (D_PLS) and 0.9656 (C_PLS) to 0.9858.
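The abstract describes the classify-then-local-PLS procedure but not the exact rule H_PLS uses to compare the two model outputs. Below is a minimal sketch of the idea using scikit-learn (SVC and PLSRegression); the function name, the agreement threshold, and the fallback rule for choosing between the class-specific (C_PLS) and global (D_PLS) predictions are illustrative assumptions, not the paper's actual method.

```python
# Sketch: SVM classification followed by class-local and global PLS models,
# combined into a hybrid prediction (assumed combination rule).
import numpy as np
from sklearn.svm import SVC
from sklearn.cross_decomposition import PLSRegression


def h_pls_predict(X_train, y_train, labels_train, x_test,
                  n_components=5, agree_tol=0.5):
    """Predict a property value for one NIR spectrum x_test."""
    x_test = np.asarray(x_test).reshape(1, -1)

    # D_PLS: PLS model built on the whole training set.
    d_pls = PLSRegression(n_components=n_components).fit(X_train, y_train)
    y_d = d_pls.predict(x_test).item()

    # Classify the test sample, then build C_PLS on the training samples
    # of the predicted class only.
    svm = SVC(kernel="rbf").fit(X_train, labels_train)
    cls = svm.predict(x_test)[0]
    mask = labels_train == cls
    n_comp_c = max(1, min(n_components, int(mask.sum()) - 1))
    c_pls = PLSRegression(n_components=n_comp_c).fit(X_train[mask], y_train[mask])
    y_c = c_pls.predict(x_test).item()

    # Assumed hybrid rule: trust the class-specific prediction when the two
    # models roughly agree; otherwise fall back to the global model, which is
    # less sensitive to a misclassified test sample.
    return y_c if abs(y_c - y_d) <= agree_tol else y_d
```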