Authors: Wang Ping [1,2], Tian Huage [1], Tian Xuemin [1], Huang Dexian [2]
Affiliations: [1] College of Information and Control Engineering, China University of Petroleum (East China), Dongying, Shandong 257061, China; [2] Department of Automation, Tsinghua University, Beijing 100084, China
Source: CIESC Journal (《化工学报》), 2010, No. 8, pp. 2040-2045 (6 pages)
Funding: National High Technology Research and Development Program of China (2007AA04Z193); Postgraduate Innovation Fund of China University of Petroleum (East China) (Z10-09)
Abstract: The quantity and quality of training samples are critical to process modeling and largely determine the quality of the resulting model. Based on an incremental support vector regression (SVR) learning algorithm, an online adaptive modeling method is proposed that selectively adds and deletes training samples. The method uses the Karush-Kuhn-Tucker (KKT) conditions of the current SVR model to screen incoming samples, so that only those carrying sufficient new information are introduced into incremental learning; this maintains the model's generalization ability while reducing the update frequency. In addition, to track changes in process characteristics quickly and accurately, the decision to delete old samples is made by evaluating how well the current model describes the newly added samples: if a new sample cannot be described by the established model and yields a large prediction error, the model must be updated and an obsolete sample removed. In that case, the sample discarded is not simply the oldest one but, based on inter-sample similarity, the old sample that differs most from the current process characteristics. The method is applied to build a melt index prediction model for an industrial polypropylene unit. The results show that, compared with other methods, the resulting model exhibits better generalization performance and a markedly lower update frequency, and adapts effectively to changes in operating conditions.
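The two sample-management steps in the abstract can be sketched in code. This is a minimal illustration, not the paper's implementation: it assumes standard epsilon-SVR, where a new sample lying inside the epsilon-tube of the current model satisfies its KKT conditions (no new information, so it is discarded), while a sample outside the tube triggers retraining. The similarity measure for deletion is assumed here to be Euclidean distance, and names such as `screen_and_update` are illustrative only.

```python
# Sketch of KKT-based sample screening and similarity-based deletion
# for an online SVR model. Assumptions (not from the paper): epsilon-SVR,
# Euclidean distance as the inter-sample similarity measure.
import numpy as np
from sklearn.svm import SVR

def screen_and_update(model, X_train, y_train, x_new, y_new, epsilon=0.1):
    """Return (model, X_train, y_train, updated) after KKT screening.

    A sample whose prediction error is within epsilon satisfies the
    KKT conditions of the current model and is discarded; otherwise
    it is added and the model is retrained.
    """
    err = abs(model.predict(x_new.reshape(1, -1))[0] - y_new)
    if err <= epsilon:  # inside the epsilon-tube: no new information
        return model, X_train, y_train, False
    X_train = np.vstack([X_train, x_new])   # informative sample: keep it
    y_train = np.append(y_train, y_new)
    model = SVR(kernel="rbf", epsilon=epsilon).fit(X_train, y_train)
    return model, X_train, y_train, True

def delete_least_similar(X_train, y_train, x_new):
    """Drop the old sample least similar to the new one (largest
    Euclidean distance), rather than simply the oldest sample."""
    d = np.linalg.norm(X_train - x_new, axis=1)
    idx = int(np.argmax(d))
    return np.delete(X_train, idx, axis=0), np.delete(y_train, idx)
```

In use, each arriving measurement is first screened; only when the screening step reports an update and the training set has reached its size budget would `delete_least_similar` be invoked, keeping the window focused on the current operating region.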
Classification code: TP301.6 [Automation and Computer Technology — Computer System Architecture]