Authors: Zheng Rui [1,2]; Yu Tong; Cheng Longyue
Affiliations: [1] College of Physics and Electronic Information, Anhui Normal University, Wuhu 241002, China; [2] Anhui Province Engineering Laboratory of Intelligent Robot's Information Fusion and Control, Wuhu 241000, China
Source: Information Technology and Network Security (《信息技术与网络安全》), 2020, Issue 6, pp. 31-37, 43 (8 pages)
Funding: National Natural Science Foundation of China (61074162, 61503003); Anhui Provincial Natural Science Foundation (1908085MF216).
Abstract: To address the low recognition accuracy and long processing time of deep learning methods when identifying objects with similar shapes, a recognition method based on an improved LeNet-5 is proposed. Building on the traditional LeNet-5 network, each convolutional layer is replaced with a double-layer asymmetric convolution to strengthen feature extraction; batch normalization is applied to improve the network's generalization ability; the original Flatten layer is replaced by global average pooling to overcome the large parameter count and long computation time of the traditional fully-connected layers; and the training set is augmented to increase the number of training samples. Experimental results show that the improved LeNet-5 network reaches a training accuracy of 91% and a recognition accuracy of 87% on objects with similar shapes, and converges within fewer iterations; all of these indicators are significantly better than those of the original network.
Keywords: LeNet-5 network; image recognition; asymmetric convolution; batch normalization; global average pooling
Classification code: TP183 [Automation and Computer Technology - Control Theory and Control Engineering]
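The abstract names four modifications to LeNet-5: double-layer asymmetric convolutions, batch normalization, global average pooling in place of the Flatten layer, and training-set augmentation. The sketch below, written with tensorflow.keras, shows one way such a network could be assembled; the kernel sizes, channel counts, input shape, augmentation settings, and 10-class output are illustrative assumptions and are not taken from the paper.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def improved_lenet5(input_shape=(32, 32, 1), num_classes=10):
    """Sketch of the modified LeNet-5 described in the abstract (hyperparameters assumed)."""
    return models.Sequential([
        # Double-layer asymmetric convolution: a 5x5 kernel factorized into 1x5
        # followed by 5x1, keeping the receptive field with fewer parameters.
        layers.Conv2D(6, (1, 5), padding="same", activation="relu", input_shape=input_shape),
        layers.Conv2D(6, (5, 1), padding="same", activation="relu"),
        layers.BatchNormalization(),      # batch normalization to improve generalization
        layers.MaxPooling2D((2, 2)),

        layers.Conv2D(16, (1, 5), padding="same", activation="relu"),
        layers.Conv2D(16, (5, 1), padding="same", activation="relu"),
        layers.BatchNormalization(),
        layers.MaxPooling2D((2, 2)),

        # Global average pooling replaces Flatten plus large fully-connected layers,
        # cutting the parameter count and inference time.
        layers.GlobalAveragePooling2D(),
        layers.Dense(num_classes, activation="softmax"),
    ])

# Training-set augmentation (rotation, shifts, flips) to enlarge the sample pool.
augment = tf.keras.preprocessing.image.ImageDataGenerator(
    rotation_range=15, width_shift_range=0.1,
    height_shift_range=0.1, horizontal_flip=True)

model = improved_lenet5()
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()
```

Factorizing a k x k kernel into 1 x k and k x 1 convolutions reduces the per-filter weight count from k^2 to 2k (25 to 10 for k = 5), which is consistent with the parameter and time savings the abstract attributes to the asymmetric convolution and the global average pooling.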