Authors: WANG Hong-xia [1]; ZHOU Jia-qi; GU Cheng-hao; LIN Hong [1] (School of Computer Science and Technology, Wuhan University of Technology, Wuhan 430063, China)
Affiliation: [1] School of Computer Science and Technology, Wuhan University of Technology
Source: Journal of Zhejiang University (Engineering Science), 2019, No. 7, pp. 1363-1373 (11 pages)
Abstract: To improve image classification performance, a new combined activation function, relu-softsign, is proposed. It addresses two problems: the derivative of relu, the activation function commonly used in convolutional neural networks, is identically zero on the negative x half-axis, which easily causes irreversible neuron "necrosis" (dead neurons) during training; and the existing combined activation function relu-softplus can only converge with a small learning rate, which slows convergence. The role of the activation function during training is analyzed, and the key points to consider when designing an activation function are given. Following these points, the relu and softsign functions are combined piecewise on the positive and negative x half-axes, so that the derivative on the negative half-axis is no longer identically zero. Comparative experiments against single activation functions and the relu-softplus combination are conducted on the MNIST, PI100, CIFAR-100 and Caltech256 datasets. The results show that the relu-softsign combined activation function improves model classification accuracy, simply and effectively mitigates the irreversible neuron "necrosis" phenomenon, and accelerates model convergence, with better convergence behavior on complex datasets.
Keywords: image classification; convolutional neural network; activation function; relu; neuron necrosis; combined activation function
Classification: TP391 (Automation and Computer Technology / Computer Application Technology)
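The abstract describes combining relu and softsign piecewise across the positive and negative x half-axes so that the gradient for negative inputs is never zero. The record does not reproduce the paper's exact formula, so the snippet below is only a minimal sketch of one plausible reading, assuming relu (the identity) is kept unchanged for x >= 0 and softsign, x / (1 + |x|), is used for x < 0.

```python
import numpy as np

def relu_softsign(x):
    """Piecewise relu/softsign combination (sketch, not the paper's exact formula).

    For x >= 0 the function behaves like relu (identity), keeping relu's
    non-saturating gradient of 1 on the positive half-axis.
    For x < 0 it behaves like softsign, x / (1 + |x|), whose derivative
    1 / (1 + |x|)^2 is strictly positive, so negative inputs still
    propagate a gradient instead of "dying" as with plain relu.
    """
    x = np.asarray(x, dtype=float)
    return np.where(x >= 0, x, x / (1.0 + np.abs(x)))

def relu_softsign_grad(x):
    """Derivative of the sketch above: 1 on the positive half-axis,
    1 / (1 + |x|)^2 (never zero) on the negative half-axis."""
    x = np.asarray(x, dtype=float)
    return np.where(x >= 0, 1.0, 1.0 / (1.0 + np.abs(x)) ** 2)

if __name__ == "__main__":
    xs = np.array([-5.0, -1.0, -0.1, 0.0, 0.1, 1.0, 5.0])
    print(relu_softsign(xs))       # negative inputs map to small nonzero values
    print(relu_softsign_grad(xs))  # gradient on the negative side stays > 0
```

The design intent mirrored here is that the positive branch preserves relu's fast, non-saturating behavior, while the bounded softsign branch gives negative activations a small but nonzero gradient, which is what the abstract credits with mitigating neuron "necrosis".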