Research on the Orthogonal Design Method for Neural Networks  (Cited by: 35)

Orthogonal Method for Training Neural Networks


Authors: Zhou Yi[1], Xu Boling[1]

Affiliation: [1] Institute of Acoustics, State Key Laboratory of Modern Acoustics, Nanjing University, Nanjing 210093, China

Source: Journal of Nanjing University (Natural Science), 2001, No. 1, pp. 72-78 (7 pages)

Funding: Supported by the National Natural Science Foundation of China (69872014)

Abstract: An orthogonal design method for selecting the training samples of feed-forward neural networks is proposed, and its underlying principle is discussed. Although neural networks have been studied for several decades, how to select suitable samples for training a network remains an open problem. Without loss of generality, a three-layer BP network is taken as the test case: training sample sets are derived from orthogonal tables of a given level, the network is trained on those sets, and the trained network then makes discrete predictions over each input domain. A one-dimensional function and a three-dimensional function are used as examples; each input variable is covered with 3, 4, or 5 levels according to the corresponding orthogonal table, which yields 9, 16, or 25 training samples respectively. Comparisons are carried out as follows: (i) network simulation with training samples of different levels; (ii) raising the level of sample selection versus increasing the number of hidden neurons; and (iii) one group of 4-level samples against four groups of random samples of sizes 18, 20, 22, and 24 drawn from the 4-level sample domain. The detailed computations and comparisons lead to the following conclusions. First, applying the orthogonal design method to the selection of network training samples is effective and gives good simulation results. Second, a 4-level or 5-level orthogonal table is accurate enough for selecting the training samples. Third, raising the level of sample selection gives better results than increasing the number of hidden neurons.
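The sample-selection scheme described in the abstract can be illustrated with a short sketch. The Python example below is a minimal sketch, not the authors' code: it builds a training set for a hypothetical three-dimensional test function from the standard L9(3^4) orthogonal table (three levels, nine samples), trains a small three-layer BP network with plain NumPy, and evaluates it on random test points. The test function, input ranges, hidden-layer size, learning rate, and epoch count are all illustrative assumptions, since the abstract does not specify them.

```python
import numpy as np

# Standard L9(3^4) orthogonal table (levels coded 1..3); the first three
# columns assign a level to each of the three input variables.
L9 = np.array([
    [1, 1, 1, 1], [1, 2, 2, 2], [1, 3, 3, 3],
    [2, 1, 2, 3], [2, 2, 3, 1], [2, 3, 1, 2],
    [3, 1, 3, 2], [3, 2, 1, 3], [3, 3, 2, 1],
])

def levels_to_values(table, lows, highs):
    """Map coded levels 1..k to equally spaced values in each variable's range."""
    k = table.max()
    frac = (table[:, :len(lows)] - 1) / (k - 1)          # 0, 0.5, 1 for k = 3
    return np.asarray(lows) + frac * (np.asarray(highs) - np.asarray(lows))

# Hypothetical three-dimensional target function (a stand-in for the paper's
# test function, which is not given in the abstract).
def target(x):
    return np.sin(x[:, 0]) + x[:, 1] * x[:, 2]

X_train = levels_to_values(L9, lows=[0.0, 0.0, 0.0], highs=[1.0, 1.0, 1.0])
y_train = target(X_train).reshape(-1, 1)

# Three-layer BP network (input -> hidden -> output) trained by gradient
# descent; the hidden size and learning rate are arbitrary choices.
rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 3, 8, 1
W1 = rng.normal(scale=0.5, size=(n_in, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=(n_hidden, n_out)); b2 = np.zeros(n_out)

def forward(X):
    h = np.tanh(X @ W1 + b1)      # hidden layer with tanh activation
    return h, h @ W2 + b2         # linear output layer

lr = 0.05
for epoch in range(5000):
    h, y_hat = forward(X_train)
    err = y_hat - y_train                        # error signal at the output
    # Backpropagate through the two weight layers.
    gW2 = h.T @ err / len(X_train)
    gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)             # tanh derivative
    gW1 = X_train.T @ dh / len(X_train)
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Discrete prediction over the input domain, as in the paper's evaluation step.
X_test = rng.uniform(0.0, 1.0, size=(200, 3))
_, y_pred = forward(X_test)
rmse = np.sqrt(np.mean((y_pred.ravel() - target(X_test)) ** 2))
print(f"RMSE on 200 random test points: {rmse:.4f}")
```

A 4-level or 5-level design would follow the same pattern with an L16(4^5) or L25(5^6) table, giving the 16 or 25 training samples mentioned in the abstract.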

Keywords: orthogonal design method; neural network; BP network

Classification: TP183 [Automation and Computer Technology / Control Theory and Control Engineering]

 
