Affiliations: [1] College of Electronic Information and Control Engineering, Beijing University of Technology, Beijing 100124, China [2] School of Mathematical Sciences, University of Jinan, Jinan 250022, Shandong, China
Source: Control Theory & Applications (《控制理论与应用》), 2014, No. 5, pp. 638-643 (6 pages)
Funding: Supported by the National Natural Science Foundation of China (61034008, 61203099, 61225016), the Beijing Natural Science Foundation (4122006), and the Doctoral Fund of the Ministry of Education of China for New Teachers (20121103120020)
Abstract: Focusing on the problem of architectural design for the extreme learning machine (ELM), we propose a constructive algorithm for feedforward neural networks based on the hidden-layer activation function and its derivative. First, taking the Sigmoid function as an example, we describe a derived property of a class of basis functions: the derivative can be expressed in terms of its primitive function. Using this property, we then propose a method for designing the ELM structure that automatically generates a feedforward neural network with two hidden layers. The nodes of the first hidden layer are generated randomly one by one; the outputs of the second hidden layer are determined by the activation function of each newly added first-layer node and its derivative; and the output-layer weights are obtained analytically by the least-squares method. Finally, we prove the convergence and stability of the proposed algorithm. Simulation results on nonlinear system identification and the two-spiral classification problem demonstrate the effectiveness of the proposed algorithm.
CLC number: TP183 [Automation and Computer Technology — Control Theory and Control Engineering]
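The abstract describes the constructive procedure only at a high level, so the following Python sketch is illustrative rather than a reproduction of the paper's method. It assumes: the Sigmoid derivative is formed through its primitive, σ'(x) = σ(x)(1 − σ(x)); each newly added first-hidden-layer node (with random weights and bias) contributes both its activation and this derivative as columns of the second hidden layer's output (an assumed coupling between the two layers); and the output weights are re-fitted by ordinary least squares after every addition until a target error is reached. The function and parameter names (constructive_elm, max_nodes, tol) are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def constructive_elm(X, T, max_nodes=50, tol=1e-3, seed=None):
    """Illustrative constructive two-hidden-layer ELM (not the paper's exact rules).

    First-hidden-layer nodes are added one at a time with random input
    weights; each new node contributes its sigmoid activation h and the
    derivative h * (1 - h) (the derivative written via its primitive) as
    second-hidden-layer features. Output weights are recomputed analytically
    by least squares after each addition.
    """
    rng = np.random.default_rng(seed)
    N, d = X.shape
    H2 = np.empty((N, 0))          # accumulated second-hidden-layer outputs
    beta = np.zeros((0,))          # output-layer weights
    for _ in range(max_nodes):
        w = rng.uniform(-1.0, 1.0, size=d)   # random input weights of the new node
        b = rng.uniform(-1.0, 1.0)           # random bias of the new node
        h = sigmoid(X @ w + b)               # activation of the new node
        dh = h * (1.0 - h)                   # its derivative, expressed via the primitive
        H2 = np.hstack([H2, h[:, None], dh[:, None]])
        # analytic output weights: least-squares solution of H2 @ beta ≈ T
        beta, *_ = np.linalg.lstsq(H2, T, rcond=None)
        rmse = np.sqrt(np.mean((H2 @ beta - T) ** 2))
        if rmse < tol:                       # stop growing once the target error is met
            break
    return H2, beta

# toy usage: one-dimensional regression
X = np.linspace(-1.0, 1.0, 200)[:, None]
T = np.sin(3.0 * X).ravel()
_, beta = constructive_elm(X, T, max_nodes=40, seed=0)
```

Re-fitting the output weights analytically after each random node addition mirrors the incremental flavour described in the abstract; the paper additionally proves convergence and stability of its construction, which this sketch does not address.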