Affiliation: [1] Department of Physics, Nankai University, Tianjin 300071
Source: Acta Scientiarum Naturalium Universitatis Nankaiensis (Journal of Nankai University, Natural Science Edition), 1997, No. 2, pp. 31-35, 47 (6 pages)
Funding: Supported by the National Natural Science Foundation of China and the "Nonlinear Science" Climbing Project
Abstract: The synaptic couplings of an artificial neural network are treated as generalized spin variables, so that learning becomes an optimization problem in the space of couplings. The continuous-time dynamical equations usually written for the network's configuration (pattern) space are extended to the synaptic-coupling space, and a mechanism similar to the Metropolis Monte Carlo algorithm is introduced into these equations to improve their optimization ability, yielding an evolution-equation learning algorithm for artificial neural networks. The algorithm largely escapes local extrema of the energy function and obtains optimal or near-optimal couplings. The paper concentrates on single-layered recurrent networks, takes the energy function in coupling space to be quadratic, and studies the critical storage capacity α_c of networks whose couplings are obtained with this algorithm. Computer simulations show that when the number of neurons N is small, α_c approaches the result of the pseudo-inverse model, and that the storage capacity decreases slowly as N increases.
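The core idea in the abstract (learning as Metropolis-style stochastic search over the coupling matrix, with a quadratic energy on the stored patterns) can be illustrated with a minimal sketch. This is not the paper's exact evolution-equation algorithm: the particular energy function, the single-coupling proposal move, and the parameters `N`, `P`, `T`, `scale`, and `steps` are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 16  # number of neurons (small-N regime, as in the paper's simulations)
P = 4   # number of stored +/-1 patterns
patterns = rng.choice([-1.0, 1.0], size=(P, N))

def energy(J):
    """Quadratic energy in coupling space (illustrative choice):
    penalize local fields h_i^mu = sum_j J_ij xi_j^mu that deviate
    from the stored pattern values xi_i^mu."""
    h = patterns @ J.T  # shape (P, N)
    return float(np.sum((h - patterns) ** 2))

def metropolis_learn(J, T=0.05, steps=20000, scale=0.1):
    """Metropolis-style search over couplings: perturb one coupling,
    always accept downhill moves, accept uphill moves with
    probability exp(-dE / T) so the search can leave local extrema."""
    E = energy(J)
    for _ in range(steps):
        i, j = rng.integers(N), rng.integers(N)
        dw = scale * rng.normal()
        J[i, j] += dw
        E_new = energy(J)
        if E_new <= E or rng.random() < np.exp(-(E_new - E) / T):
            E = E_new          # accept the move
        else:
            J[i, j] -= dw      # reject: restore the coupling
    return J, E

J0 = 0.01 * rng.normal(size=(N, N))
J, E_final = metropolis_learn(J0.copy())

# Fraction of pattern bits that are fixed under one synchronous update.
recalled = np.sign(patterns @ J.T)
print("final energy:", E_final)
print("recall fraction:", np.mean(recalled == patterns))
```

Because P < N here, an exact zero-energy solution (the pseudo-inverse couplings) exists, so the stochastic search should drive the energy well below its initial value; the temperature `T` controls the trade-off between escaping local extrema and settling into a minimum.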
Classification: TP18 [Automation and Computer Technology / Control Theory and Control Engineering]