A Pruned Incremental Extreme Learning Fuzzy Neural Network (cited by: 1)

Pruned Incremental Extreme Learning Machine Fuzzy Neural Network


Authors: Hu Rong [1,2], Xu Weihong [1,3]

Affiliations: [1] School of Computer Science and Technology, Nanjing University of Science and Technology, Nanjing 210000; [2] Changsha Aeronautical Vocational and Technical College, Changsha 410014; [3] School of Computer and Communication Engineering, Changsha University of Science and Technology, Changsha 410000

Source: Computer Science (《计算机科学》), 2013, No. 5, pp. 279-282 (4 pages)

Funding: National Natural Science Foundation of China (61163040); Scientific Research Project of the Hunan Provincial Department of Education (11C0009)

Abstract: The extreme learning machine (ELM) batch learning algorithm proposed by Huang et al. has been shown to be extremely fast while achieving generalization performance comparable to other batch training methods. To enable online incremental learning, this paper extends the ELM approach and proposes a pruned incremental extreme learning algorithm for fuzzy neural networks. First, the antecedent parameters of the fuzzy neural network and the number of rules are generated randomly. Then singular value decomposition (SVD) is used to rank the fuzzy rules (fuzzy basis functions) by importance, the best number of rules is selected by a fast computation of the leave-one-out (LOO) validation error, and the consequent parameters are determined analytically. Past data do not need to be stored during learning, so the method is truly incremental: when new data arrive, the network does not have to be retrained. Simulation experiments comparing the method with well-known neuro-fuzzy algorithms show that it achieves similar accuracy and speed while producing a more compact network structure.
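To make the pipeline described in the abstract concrete, the following is a minimal Python/NumPy sketch of its four steps: randomly generated antecedents, an SVD-based importance ranking of the fuzzy basis functions, rule-count selection by a fast leave-one-out (PRESS) error, and an analytical least-squares solution for the consequents. The Gaussian membership functions, the particular importance score, and all names (`gaussian_basis`, `press_loo_error`, `fit_pruned_elm_fuzzy`) are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def gaussian_basis(X, centers, widths):
    """Rule activations using product Gaussian membership functions."""
    # X: (N, d); centers, widths: (R, d)  ->  H: (N, R)
    diff = X[:, None, :] - centers[None, :, :]
    return np.exp(-np.sum((diff / widths[None, :, :]) ** 2, axis=2))

def press_loo_error(H, y):
    """Fast leave-one-out (PRESS) error of the linear fit y ~ H @ beta.

    Uses the hat-matrix diagonal, so no refitting per left-out sample is needed.
    Assumes H is (close to) full column rank.
    """
    H_pinv = np.linalg.pinv(H)
    beta = H_pinv @ y
    residuals = y - H @ beta
    hat_diag = np.einsum('ij,ji->i', H, H_pinv)   # diag(H @ H_pinv)
    loo_residuals = residuals / (1.0 - hat_diag)
    return np.mean(loo_residuals ** 2), beta

def fit_pruned_elm_fuzzy(X, y, max_rules=30, rng=None):
    rng = np.random.default_rng(rng)
    N, d = X.shape

    # 1) Randomly generate antecedent parameters (rule centers and widths).
    centers = rng.uniform(X.min(0), X.max(0), size=(max_rules, d))
    widths = rng.uniform(0.2, 1.0, size=(max_rules, d)) * (X.max(0) - X.min(0))
    H = gaussian_basis(X, centers, widths)        # (N, max_rules)

    # 2) SVD-based ranking: weight of each rule (column of H) in the
    #    dominant right singular subspace -- a simplified stand-in for the
    #    paper's exact SVD-based importance ordering.
    _, s, Vt = np.linalg.svd(H, full_matrices=False)
    r = int(np.sum(s > s[0] * 1e-6))              # effective rank of H
    importance = np.sum(Vt[:r, :] ** 2, axis=0)
    order = np.argsort(importance)[::-1]

    # 3) Choose the rule count that minimizes the fast LOO (PRESS) error.
    best = None
    for k in range(1, max_rules + 1):
        idx = order[:k]
        loo, beta = press_loo_error(H[:, idx], y)
        if best is None or loo < best[0]:
            best = (loo, idx, beta)

    # 4) Consequent parameters `beta` were already solved analytically above.
    loo, idx, beta = best
    return {"centers": centers[idx], "widths": widths[idx], "beta": beta, "loo": loo}

# Usage on a toy regression problem.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(200, 1))
    y = np.sin(3 * X[:, 0]) + 0.05 * rng.standard_normal(200)
    model = fit_pruned_elm_fuzzy(X, y, max_rules=25, rng=1)
    print(len(model["beta"]), "rules kept, LOO MSE =", model["loo"])
```

Because the LOO error comes from the hat-matrix diagonal rather than from refitting, sweeping over every candidate rule count stays cheap, which is what makes combining pruning with ELM-style random antecedents practical.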

Keywords: Extreme learning machine (ELM); Incremental learning; Fuzzy neural network; Radial basis function

CLC number: TP306.1 [Automation and Computer Technology - Computer System Architecture]

 
