Incremental constructive extreme learning machine (网络结构增长的极端学习机算法)

Cited by: 4


Authors: LI Fan-jun (李凡军)[1,2], QIAO Jun-fei (乔俊飞)[1], HAN Hong-gui (韩红桂)[1]

Affiliations: [1] College of Electronic Information and Control Engineering, Beijing University of Technology, Beijing 100124, China; [2] School of Mathematical Sciences, University of Jinan, Jinan 250022, Shandong, China

Source: Control Theory & Applications (《控制理论与应用》), 2014, No. 5, pp. 638-643 (6 pages)

Funding: National Natural Science Foundation of China (61034008, 61203099, 61225016); Beijing Natural Science Foundation (4122006); Specialized Research Fund for the Doctoral Program of Higher Education for New Teachers, Ministry of Education (20121103120020)

Abstract: Focusing on the problem of architectural design for the extreme learning machine (ELM), we propose a constructive algorithm for feedforward neural networks based on the hidden-layer activation function and its derivatives. First, taking the Sigmoid function as an example, we describe a derived characteristic of a class of basis functions: the derivative can be expressed in terms of the original function. Using this characteristic, we propose a structure-design method for ELM that automatically generates a feedforward neural network with two hidden layers. Nodes in the first hidden layer are generated randomly, one at a time; the outputs of the second hidden layer are determined by the activation function of the newly added first-layer node and its derivatives; and the output-layer weights are obtained analytically by the least squares method. Finally, we give theoretical proofs of the convergence and stability of the proposed algorithm. Simulation results on nonlinear system identification and the two-spiral classification problem demonstrate the effectiveness of the proposed method.
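
The abstract describes the construction loop but not its implementation details. Below is a minimal Python/NumPy sketch of how such a two-hidden-layer constructive ELM could look, assuming: a sigmoid first layer whose derivative is obtained through the derived characteristic s'(z) = s(z)(1 - s(z)); a second hidden layer formed by stacking each new node's activation and its derivative as two columns; random input weights drawn uniformly from [-1, 1]; and an RMSE-based stopping rule. The names incremental_elm and elm_predict, the weight range, and the stopping criterion are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def sigmoid(z):
    # Logistic sigmoid; its derivative satisfies s'(z) = s(z) * (1 - s(z)),
    # i.e. the derivative is expressible through the original function.
    return 1.0 / (1.0 + np.exp(-z))

def incremental_elm(X, T, max_nodes=50, tol=1e-3, seed=None):
    """Sketch of a constructive ELM with two hidden layers (assumed details)."""
    rng = np.random.default_rng(seed)
    n_samples, n_features = X.shape
    H2 = np.empty((n_samples, 0))   # second-hidden-layer output matrix
    params = []                     # (w, b) of first-hidden-layer nodes
    beta = None

    for _ in range(max_nodes):
        # New first-hidden-layer node with random weights and bias.
        w = rng.uniform(-1.0, 1.0, size=n_features)
        b = rng.uniform(-1.0, 1.0)
        h = sigmoid(X @ w + b)            # activation of the new node
        dh = h * (1.0 - h)                # its derivative via the derived characteristic
        H2 = np.hstack([H2, h[:, None], dh[:, None]])  # assumed 2nd-layer mapping

        # Output weights re-solved analytically by least squares.
        beta, *_ = np.linalg.lstsq(H2, T, rcond=None)
        err = np.sqrt(np.mean((H2 @ beta - T) ** 2))
        params.append((w, b))
        if err < tol:                     # assumed RMSE stopping rule
            break
    return params, beta

def elm_predict(X, params, beta):
    # Rebuild the second-hidden-layer outputs and apply the output weights.
    cols = []
    for w, b in params:
        h = sigmoid(X @ w + b)
        cols.extend([h, h * (1.0 - h)])
    return np.column_stack(cols) @ beta

# Toy usage (illustrative only): approximate y = sin(x) on [-3, 3].
X = np.linspace(-3.0, 3.0, 200)[:, None]
T = np.sin(X).ravel()
params, beta = incremental_elm(X, T, max_nodes=30, tol=1e-2, seed=0)
rmse = np.sqrt(np.mean((elm_predict(X, params, beta) - T) ** 2))
print(f"{len(params)} hidden nodes, training RMSE = {rmse:.4f}")
```

The least-squares step mirrors the standard ELM practice of computing output weights analytically; the nonlinear-system-identification and two-spiral experiments reported in the paper are not reproduced here.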

Keywords: feedforward neural networks; extreme learning machine; derivative; structure design

CLC number: TP183 [Automation and Computer Technology: Control Theory and Control Engineering]
