Research on Optimization of SPReLU Activation Function in Convolutional Neural Network  (Cited by: 10)

Authors: WU Tingting (吴婷婷); XU Xiaodong (许晓东) [1]; WU Yunlong (吴云龙) [1] (School of Computer Science and Communication Engineering, Jiangsu University, Zhenjiang 212013)

Affiliation: [1] School of Computer Science and Communication Engineering, Jiangsu University, Zhenjiang 212013

Source: Computer & Digital Engineering (《计算机与数字工程》), 2021, No. 8, pp. 1637-1641 (5 pages)

Abstract: Owing to the intrinsic properties of activation functions, convolutional neural networks suffer from vanishing gradients, dying neurons, mean shift, and poor sparse representation. To address these problems, this paper compares S-shaped (sigmoid-type) activation functions with the ReLU family, discussing the strengths and weaknesses of each, and then proposes a new activation function, SPReLU, which combines the advantages of ReLU, PReLU, and Softplus. Experimental results show that SPReLU outperforms the other activation functions: it converges faster, effectively reduces training error, alleviates vanishing gradients and dying neurons, and effectively improves the accuracy of the text classification model.
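For reference, the three activations that SPReLU draws on have the following standard textbook definitions (the paper may use slightly different variants):

```latex
\mathrm{ReLU}(x)=\max(0,x),\qquad
\mathrm{PReLU}(x)=\begin{cases}x, & x\ge 0\\ a\,x, & x<0\end{cases},\qquad
\mathrm{Softplus}(x)=\ln\!\left(1+e^{x}\right)
```

Here a is a learnable negative-slope parameter. Softplus is a smooth approximation of ReLU with a nonzero gradient everywhere, which helps against vanishing gradients, while PReLU's learnable negative slope keeps negative inputs from producing permanently dead neurons.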
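The abstract does not state SPReLU's exact formula, only that it combines ReLU, PReLU, and Softplus. A minimal PyTorch sketch of one plausible composition follows: a Softplus positive branch, shifted by ln 2 so the function passes through the origin, joined to a PReLU-style learnable negative slope. The class name SPReLU, the parameter alpha, and this particular composition are illustrative assumptions, not the authors' definition.

```python
import math

import torch
import torch.nn as nn
import torch.nn.functional as F


class SPReLU(nn.Module):
    """Hypothetical ReLU/PReLU/Softplus hybrid.

    NOTE: the paper's abstract does not give the actual formula;
    this composition and `alpha` are assumptions for illustration.
    """

    def __init__(self, init_alpha: float = 0.25):
        super().__init__()
        # PReLU-style learnable slope for negative inputs (assumed).
        self.alpha = nn.Parameter(torch.tensor(init_alpha))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Positive branch: Softplus shifted down by ln(2) so f(0) = 0
        # and the two branches join continuously at the origin.
        pos = F.softplus(x) - math.log(2.0)
        # Negative branch: learnable linear slope, as in PReLU, which
        # keeps a nonzero gradient and mitigates dying neurons.
        neg = self.alpha * x
        return torch.where(x >= 0, pos, neg)
```

Used this way, SPReLU is a drop-in replacement for nn.ReLU in a CNN, e.g. nn.Sequential(nn.Conv2d(3, 16, 3), SPReLU(), nn.MaxPool2d(2)).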

Keywords: convolutional neural network; activation function; vanishing gradient; dying neurons; ReLU

Classification: TP391.41 [Automation and Computer Technology - Computer Application Technology]

 
