Straight Convolutional Neural Networks Algorithm Based on Batch Normalization for Image Classification  (Cited by: 25)

Authors: Zhu Wei [1]; Qu Jingyi [1]; Wu Renbiao [1] (Tianjin Key Laboratory for Advanced Signal Processing, Civil Aviation University of China, Tianjin 300300)

Affiliation: [1] Tianjin Key Laboratory for Advanced Signal Processing, Civil Aviation University of China, Tianjin 300300

Source: Journal of Computer-Aided Design & Computer Graphics, 2017, Issue 9, pp. 1650-1657 (8 pages)

Funding: Young Scientists Fund of the National Natural Science Foundation of China (11402294); Open Fund of the Tianjin Key Laboratory for Advanced Signal Processing (2015AFS03); the Sixth Boeing Fund of Civil Aviation University of China (20160159209)

Abstract: To address the difficulty of training deep convolutional neural networks caused by vanishing gradients, a straight convolutional neural network (SCNN) algorithm based on batch normalization is proposed. First, the activations of all convolutional layers in the network are batch-normalized; then the normalized data are restored with learnable reconstruction parameters; finally, the reconstruction parameters are trained. Experiments on three standard image datasets, CIFAR-10, CIFAR-100, and MNIST, show that the proposed algorithm achieves classification accuracies of 94.53%, 73.40%, and 99.74%, respectively, clearly outperforming other deep neural network algorithms and effectively overcoming the vanishing-gradient problem of traditional convolutional neural networks.
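For reference, the transform the abstract describes (normalize the activations of each convolutional layer, then restore them with learnable reconstruction parameters) corresponds to the standard batch normalization operation. The following is a minimal NumPy sketch of that operation for a convolutional feature map of shape (N, C, H, W), where gamma and beta stand in for the learnable reconstruction parameters; it is an illustration under those assumptions, not the authors' implementation.

```python
import numpy as np

def batch_norm_conv(x, gamma, beta, eps=1e-5):
    """Batch-normalize a conv activation map x of shape (N, C, H, W).

    Per channel, activations are normalized to zero mean and unit variance
    over the batch and spatial dimensions, then restored with the learnable
    scale gamma and shift beta (the "reconstruction parameters").
    """
    # Per-channel mean and variance over batch and spatial axes
    mean = x.mean(axis=(0, 2, 3), keepdims=True)   # shape (1, C, 1, 1)
    var = x.var(axis=(0, 2, 3), keepdims=True)     # shape (1, C, 1, 1)

    # Normalization step
    x_hat = (x - mean) / np.sqrt(var + eps)

    # Reconstruction step with the learnable parameters
    return gamma.reshape(1, -1, 1, 1) * x_hat + beta.reshape(1, -1, 1, 1)

# Example: a batch of 8 feature maps with 16 channels
x = np.random.randn(8, 16, 32, 32).astype(np.float32)
gamma = np.ones(16, dtype=np.float32)   # learnable scale, trained by backprop
beta = np.zeros(16, dtype=np.float32)   # learnable shift, trained by backprop
y = batch_norm_conv(x, gamma, beta)
```

In the paper's setting this transform is applied to every convolutional layer, and gamma and beta are updated by backpropagation along with the other network weights.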

Keywords: image classification; deep learning; straight convolutional neural network; batch normalization; vanishing gradient

Classification: TP391.41 [Automation and Computer Technology / Computer Application Technology]

 
