On Approximation by Neural Networks with Optimized Activation Functions and Fixed Weights  


Authors: Dansheng Yu, Yunyou Qian, Fengjun Li

Affiliations: [1] Department of Mathematics, Hangzhou Normal University, Hangzhou, Zhejiang 310036, China; [2] School of Mathematics and Statistics, Ningxia University, Yinchuan, Ningxia 750021, China

Source: Analysis in Theory and Applications, 2023, No. 1, pp. 93-104 (12 pages)

Funding: Supported by NSFC (No. 12061055).

Abstract: Recently, Li [16] introduced three kinds of single-hidden-layer feed-forward neural networks (FNNs) with optimized piecewise linear activation functions and fixed weights, and obtained upper and lower bound estimates for the accuracy with which these FNNs approximate continuous functions defined on bounded intervals. In the present paper, we point out that there are errors both in the definitions of the FNNs and in the proofs of the upper estimates in [16]. Using new methods, we also give correct approximation rate estimates for approximation by Li's neural networks.
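The abstract states its results as approximation rate estimates measured by the modulus of continuity. As an illustration only (the specific network operators, constants, and moduli of smoothness are those of [16] and the present paper and are not reproduced here), such estimates for a single-hidden-layer network \(N_n\) with \(n\) hidden neurons typically take a Jackson-type form:

```latex
\[
\| f - N_n(f) \|_{C[a,b]} \;\le\; C\,\omega\!\left(f, \frac{b-a}{n}\right),
\qquad
\omega(f,\delta) := \sup_{\substack{x,y \in [a,b] \\ |x-y| \le \delta}} |f(x) - f(y)|,
\]
```

so the approximation error decays at the rate at which the modulus of continuity \(\omega(f,\cdot)\) vanishes at the origin.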

Keywords: approximation rate; modulus of continuity; modulus of smoothness; neural network operators

Classification: O174.21 [Science — Mathematics]
