Source: Journal of Frontiers of Computer Science and Technology (《计算机科学与探索》), 2016, No. 1, pp. 130-141 (12 pages)
Funding: National Natural Science Foundation of China; Natural Science Foundation of Jiangsu Province
Abstract: Self-organizing associative memory neural networks are widely used owing to their parallelism, fault tolerance, and self-learning ability. In existing mainstream models, however, the number of network nodes may grow without bound when incrementally learning large-scale samples, which inevitably leads to unaffordable memory and computational overhead in practical applications. To solve this problem, this paper proposes a self-organizing incremental associative memory model under a capacity constraint. By taking the number of network nodes as a prior control parameter and designing a new self-competition learning strategy among nodes, the new model can incrementally learn large-scale samples and achieve high associative memory performance at a low computational capacity. Theoretical analysis proves the correctness and effectiveness of the model. Experimental results further demonstrate that the new model effectively controls computational capacity, improves the efficiency of incremental sample learning, and obtains comparable associative memory performance, thus better satisfying the demands of practical applications.
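The abstract describes the model only at a high level: a hard cap on the node count serves as the control parameter, and a self-competition rule among nodes keeps the network within that cap during incremental learning. The following is a minimal Python sketch of that general idea, not the paper's actual algorithm; the class and parameter names (CapacityConstrainedMemory, max_nodes, sim_threshold) and the merge-the-closest-pair competition rule are illustrative assumptions.

```python
# Minimal sketch (assumptions, not the paper's algorithm): a prototype-based
# associative memory whose node count never exceeds a fixed capacity.
import numpy as np

class CapacityConstrainedMemory:
    def __init__(self, max_nodes, sim_threshold):
        self.max_nodes = max_nodes          # hard cap on network size
        self.sim_threshold = sim_threshold  # distance below which a sample matches a node
        self.nodes = []                     # prototype vectors
        self.wins = []                      # how often each node has won

    def learn(self, x):
        """Incrementally learn one sample vector x."""
        x = np.asarray(x, dtype=float)
        if self.nodes:
            dists = [np.linalg.norm(x - n) for n in self.nodes]
            best = int(np.argmin(dists))
            if dists[best] < self.sim_threshold:
                # Winner absorbs the sample: running-mean update of the prototype.
                self.wins[best] += 1
                self.nodes[best] += (x - self.nodes[best]) / self.wins[best]
                return
        # No sufficiently close node: insert a new one.
        self.nodes.append(x.copy())
        self.wins.append(1)
        if len(self.nodes) > self.max_nodes:
            self._compete()

    def _compete(self):
        # Self-competition under the capacity constraint: merge the two
        # closest prototypes so the node count drops back to max_nodes.
        n = len(self.nodes)
        pairs = [(np.linalg.norm(self.nodes[i] - self.nodes[j]), i, j)
                 for i in range(n) for j in range(i + 1, n)]
        _, i, j = min(pairs)
        wi, wj = self.wins[i], self.wins[j]
        self.nodes[i] = (wi * self.nodes[i] + wj * self.nodes[j]) / (wi + wj)
        self.wins[i] = wi + wj
        del self.nodes[j], self.wins[j]

    def recall(self, x):
        """Associative recall: return the stored prototype nearest to x."""
        x = np.asarray(x, dtype=float)
        dists = [np.linalg.norm(x - n) for n in self.nodes]
        return self.nodes[int(np.argmin(dists))]

# Usage: mem = CapacityConstrainedMemory(max_nodes=100, sim_threshold=0.5)
#        for x in samples: mem.learn(x)
#        y = mem.recall(query)
```

Because the cap bounds the node count regardless of how many samples arrive, both memory use and per-sample lookup cost stay constant during incremental learning, which is the trade-off the abstract emphasizes.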
Classification code: TP18 [Automation and Computer Technology: Control Theory and Control Engineering]