Authors: 卢超 (LU Chao); 杨翠丽 (YANG Cui-li); 乔俊飞 (QIAO Jun-fei) (Faculty of Information Technology, Beijing University of Technology, Beijing 100124, China; Beijing Key Laboratory of Computational Intelligence and Intelligent System, Beijing 100124, China)
Affiliations: [1] Faculty of Information Technology, Beijing University of Technology, Beijing 100124, China; [2] Beijing Key Laboratory of Computational Intelligence and Intelligent System, Beijing 100124, China
Source: Control and Decision (《控制与决策》), 2018, No. 6, pp. 1055-1061 (7 pages)
Funding: Key Program of the National Natural Science Foundation of China (61533002); Young Scientists Fund of the National Natural Science Foundation of China (61603012)
Abstract: To address the problem that sub-network outputs cannot be optimally integrated during the structural design of a modular neural network (MNN), a dynamic MNN based on the particle swarm optimization (PSO) algorithm is proposed. Firstly, the network identifies the distribution of the sample space by computing the data density and updates the data centers accordingly. Secondly, the corresponding sub-networks are activated according to the input data, the PSO algorithm searches for the optimal contribution degree of each sub-network, and the output weights of the sub-networks are calculated from these contribution degrees. Finally, the integrated output of the MNN is optimized. Approximation experiments on a nonlinear function and a time-varying system verify that the number of sub-networks in the integrated network can be adjusted dynamically according to the task, that the integration weights of the network outputs can be driven to their optimal values by the PSO algorithm, and that the training accuracy and adaptive ability are improved compared with other algorithms.
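The abstract describes using PSO to find contribution degrees that weight the outputs of the active sub-networks. The following is a minimal illustrative sketch of that integration step, not the authors' implementation: all names (pso_integration_weights, sub_outputs, the normalization at the end) are assumptions, and the error function, particle count, and PSO coefficients are placeholders chosen for the example.

import numpy as np

def pso_integration_weights(sub_outputs, target, n_particles=30, n_iters=100,
                            w=0.7, c1=1.5, c2=1.5, seed=None):
    """Search contribution degrees minimising the squared error between the
    weighted sum of sub-network outputs and the target signal (illustrative)."""
    rng = np.random.default_rng(seed)
    n_sub = sub_outputs.shape[0]                       # one row per active sub-network
    pos = rng.uniform(0.0, 1.0, (n_particles, n_sub))  # candidate weight vectors
    vel = np.zeros_like(pos)

    def cost(weights):
        pred = weights @ sub_outputs                   # integrated MNN output
        return np.mean((pred - target) ** 2)

    pbest = pos.copy()
    pbest_cost = np.array([cost(p) for p in pos])
    gbest = pbest[np.argmin(pbest_cost)].copy()

    for _ in range(n_iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        # standard PSO velocity and position update
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        costs = np.array([cost(p) for p in pos])
        improved = costs < pbest_cost
        pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
        gbest = pbest[np.argmin(pbest_cost)].copy()

    # normalised contribution degrees (normalisation is an assumption)
    return gbest / np.sum(np.abs(gbest))

Usage: sub_outputs is an (n_sub, n_samples) array of predictions from the sub-networks activated for the current data, and the returned weights give the integrated output as weights @ sub_outputs.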
Classification: TP183 [Automation and Computer Technology - Control Theory and Control Engineering]