Affiliations: [1] School of Information Engineering, Zhengzhou University, Zhengzhou 450052, Henan, China; [2] Institute of Modern Optics, Nankai University, Tianjin 300071, China
Source: Journal of Computer Applications (《计算机应用》), 2007, Issue 5, pp. 1214-1216, 1219 (4 pages in total)
Funds: Scientific Research Foundation for Returned Overseas Scholars, Ministry of Education; Doctoral Program Foundation of the Ministry of Education (20030055022); Henan Province Outstanding Youth Foundation (512000400)
Abstract: A tapped-delay neural network model based on an adaptive training and pruning algorithm was proposed to predict stock indexes, a nonlinear time series. First, an adaptive learning algorithm based on recursive least squares was used to train the network; because the learning step size of this algorithm adjusts itself and it requires few initial parameters, convergence is fast. The trained network was then pruned to optimize its topology, reduce its computational complexity, and improve its generalization ability. The pruned network was retrained so that it carried the optimal parameters, and the resulting network was finally used to predict future stock indexes (the test samples). Simulation experiments show that, compared with the unpruned network, the optimized structure not only lowers the computational complexity but also improves the prediction accuracy: the computational complexity drops to 0.0556 of the original, and the mean square error on the test samples reaches 8.7961 × 10^-5.
CLC Number: TP183 [Automation and Computer Technology: Control Theory and Control Engineering]
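The abstract describes a four-step procedure: adaptive training of a tapped-delay network with a recursive-least-squares-type rule, pruning of the trained structure, retraining of the pruned network, and prediction on the test samples. The Python sketch below is only a hypothetical illustration of that pipeline, not the authors' implementation: the synthetic series, the fixed random hidden layer, the network sizes, the RLS parameters, and the magnitude-based pruning threshold are all assumptions made for the example.

# Minimal sketch (assumptions throughout): tapped-delay network for
# one-step-ahead prediction, adaptive (RLS-style) training of the
# output weights, magnitude-based pruning, retraining, prediction.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic nonlinear "index" series (placeholder for real stock data)
t = np.arange(600)
series = np.sin(0.05 * t) + 0.3 * np.sin(0.17 * t) + 0.05 * rng.standard_normal(t.size)

# Tapped delay line: input = last `delays` samples, target = next sample
delays = 8
X = np.array([series[i:i + delays] for i in range(len(series) - delays)])
y = series[delays:]
X_train, y_train = X[:500], y[:500]
X_test, y_test = X[500:], y[500:]

# Fixed random hidden layer; only the linear output layer is adapted
hidden = 20
W_in = rng.standard_normal((delays, hidden)) / np.sqrt(delays)
def hidden_out(X):
    return np.tanh(X @ W_in)

def rls_train(H, y, lam=0.99, delta=100.0, mask=None):
    """Recursive-least-squares style update of the output weights.
    If `mask` is given, pruned hidden units are excluded from the fit."""
    n = H.shape[1]
    w = np.zeros(n)
    P = delta * np.eye(n)                        # inverse correlation estimate
    for h, target in zip(H, y):
        if mask is not None:
            h = h * mask                         # pruned units contribute nothing
        k = P @ h / (lam + h @ P @ h)            # gain vector
        w = w + k * (target - w @ h)             # correct with a-priori error
        P = (P - np.outer(k, h @ P)) / lam       # update inverse correlation
    return w

H_train, H_test = hidden_out(X_train), hidden_out(X_test)

# 1) Adaptive training of the full network
w_full = rls_train(H_train, y_train)

# 2) Prune hidden units whose output weight magnitude is small
mask = (np.abs(w_full) > 0.1 * np.abs(w_full).max()).astype(float)

# 3) Retrain the pruned network so the surviving weights are re-optimized
w_pruned = rls_train(H_train, y_train, mask=mask) * mask

# 4) Predict the test samples and compare mean square errors
mse = lambda w: float(np.mean((H_test @ w - y_test) ** 2))
print(f"active hidden units: {int(mask.sum())}/{hidden}")
print(f"MSE full   : {mse(w_full):.3e}")
print(f"MSE pruned : {mse(w_pruned):.3e}")

Running the sketch prints the number of surviving hidden units and the test-set mean square error of the full and pruned models, mirroring the kind of before/after comparison reported in the abstract.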