Self-Organizing Incremental Associative Memory Model under Capacity Constraint (Cited by: 1)

Authors: SUN Tao [1]; XIE Zhenping [1]; WANG Shitong [1]; LIU Yuan [1]

Affiliation: [1] School of Digital Media, Jiangnan University, Wuxi, Jiangsu 214122, China

Source: Journal of Frontiers of Computer Science and Technology, 2016, Issue 1, pp. 130-141 (12 pages)

Funding: National Natural Science Foundation of China; Natural Science Foundation of Jiangsu Province

Abstract: Self-organizing associative memory neural networks are widely used thanks to their parallelism, fault tolerance, and self-learning ability. However, in existing mainstream models the number of network nodes can grow without bound when large-scale samples are learned incrementally, which imposes unaffordable memory and computation overhead in practical applications. To address this problem, this paper proposes a self-organizing incremental associative memory model under a capacity constraint. By taking the number of network nodes as a primary control parameter and designing a new self-competition learning strategy between nodes, the new model can incrementally learn large-scale samples and achieve high associative memory performance at low computational cost. Theoretical analysis establishes the soundness and effectiveness of the model, and experiments show that it effectively controls computational capacity, improves the efficiency of incremental sample learning, and attains comparable associative memory performance, thus better satisfying the demands of practical applications.
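The record describes the mechanism only in outline: fix the node count in advance, and let nodes compete with one another once the cap is reached. As a rough illustration of that idea, below is a minimal Python sketch of a capacity-constrained self-organizing associative memory. The merge-of-nearest-prototypes competition rule, the fixed novelty threshold, and all names and parameters (`CapacityConstrainedMemory`, `max_nodes`, `new_node_dist`) are illustrative assumptions, not the authors' actual algorithm.

```python
import numpy as np


class CapacityConstrainedMemory:
    """Sketch of a capacity-constrained self-organizing associative memory.

    Key-value pairs are stored jointly as prototype vectors. When the node
    count would exceed `max_nodes`, the two most similar prototypes compete
    and are merged, keeping the network size (and hence memory and per-query
    cost) bounded while incremental learning continues.
    """

    def __init__(self, key_dim, value_dim, max_nodes=100, new_node_dist=1.0):
        self.key_dim = key_dim
        self.max_nodes = max_nodes            # hard cap on network size
        self.new_node_dist = new_node_dist    # novelty threshold for inserting a node
        self.nodes = np.empty((0, key_dim + value_dim))
        self.wins = np.empty(0)               # how often each node won the competition

    def learn(self, key, value):
        """Incrementally learn one (key, value) association."""
        x = np.concatenate([key, value]).astype(float)
        if len(self.nodes) == 0:
            self._insert(x)
            return
        dists = np.linalg.norm(self.nodes - x, axis=1)
        j = int(np.argmin(dists))
        if dists[j] < self.new_node_dist:
            # Familiar sample: pull the winning prototype toward it with a
            # step size that shrinks as the node accumulates wins.
            self.nodes[j] += (x - self.nodes[j]) / (1.0 + self.wins[j])
            self.wins[j] += 1
        else:
            self._insert(x)                   # novel sample: add a node

    def recall(self, key):
        """Return the value part of the node whose key part is nearest."""
        dists = np.linalg.norm(self.nodes[:, :self.key_dim] - key, axis=1)
        return self.nodes[int(np.argmin(dists)), self.key_dim:]

    def _insert(self, x):
        self.nodes = np.vstack([self.nodes, x])
        self.wins = np.append(self.wins, 1.0)
        if len(self.nodes) > self.max_nodes:
            self._self_compete()              # enforce the capacity constraint

    def _self_compete(self):
        # Find the closest pair of prototypes and merge them, weighted by
        # their win counts, so the node count drops back to `max_nodes`.
        best_d, a, b = np.inf, 0, 1
        for i in range(len(self.nodes)):
            for j in range(i + 1, len(self.nodes)):
                d = np.linalg.norm(self.nodes[i] - self.nodes[j])
                if d < best_d:
                    best_d, a, b = d, i, j
        wa, wb = self.wins[a], self.wins[b]
        self.nodes[a] = (wa * self.nodes[a] + wb * self.nodes[b]) / (wa + wb)
        self.wins[a] = wa + wb
        self.nodes = np.delete(self.nodes, b, axis=0)
        self.wins = np.delete(self.wins, b)
```

A toy usage on hypothetical random data shows the bounded-capacity property the abstract emphasizes: no matter how many samples are streamed in, the node count, and with it memory and per-sample cost, never exceeds the cap.

```python
mem = CapacityConstrainedMemory(key_dim=4, value_dim=2, max_nodes=8)
rng = np.random.default_rng(0)
for _ in range(500):
    k = rng.normal(size=4)
    mem.learn(k, k[:2])       # associate each key with its first two entries
assert len(mem.nodes) <= 8    # capacity constraint holds throughout
```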

Keywords: associative memory; capacity constraint; incremental learning; self-organizing; neural network

Classification: TP18 (Automation and Computer Technology: Control Theory and Control Engineering)

 
