Authors: ZHANG Supei; LIU Jun[1,2]; XIAO Aowen[1,2]; DU Zhuang (Hubei Key Laboratory of Intelligent Robot (Wuhan Institute of Technology), Wuhan 430205, China; School of Computer Science & Engineering, Wuhan Institute of Technology, Wuhan 430205, China)
Affiliations: [1] Hubei Key Laboratory of Intelligent Robot (Wuhan Institute of Technology), Wuhan 430205, Hubei, China; [2] School of Computer Science and Engineering, Wuhan Institute of Technology, Wuhan 430205, Hubei, China
Source: Journal of Wuhan Institute of Technology, 2019, No. 1, pp. 89-92 (4 pages)
Funds: Open Fund of the Hubei Key Laboratory of Intelligent Robot (HBIR 201802); the 10th Graduate Education Innovation Fund of Wuhan Institute of Technology
Abstract: To address the limitation that traditional CAPTCHA (completely automated public Turing test to tell computers and humans apart) recognition depends on character segmentation, a convolutional neural network is applied to CAPTCHA feature analysis and recognition. The whole CAPTCHA image is used as input, the traditional LeNet-5 architecture is improved, and an end-to-end convolutional neural network is constructed to extract image features layer by layer, from low level to high level, with ReLU selected as the activation function to perform the recognition task. A control group was set up in the experiments to study how different factors affect recognition accuracy. Test results show that the model performs end-to-end recognition, avoiding the shortcomings caused by the many steps of character-segmentation methods, and achieves a 99% recognition rate on the test set. The results indicate that increasing the number of training iterations and optimizing the learning rate help improve the accuracy of the convolutional neural network.
CLC number: TP317.4 [Automation & Computer Technology—Computer Software and Theory]
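The paper does not publish its implementation; as a rough illustration of the approach the abstract describes (an end-to-end, LeNet-5-style CNN with ReLU activations that takes the whole CAPTCHA image as input and skips explicit character segmentation), a minimal sketch in PyTorch is shown below. The input size of 40x100, the 4-character code length, and the 36-symbol alphabet are assumptions for the example, not details from the paper.

```python
import torch
import torch.nn as nn

class CaptchaCNN(nn.Module):
    """Minimal LeNet-5-style CNN for end-to-end CAPTCHA recognition (illustrative sketch)."""
    def __init__(self, num_chars=4, num_classes=36):  # assumed: 4 characters, 0-9 + A-Z
        super().__init__()
        self.num_chars = num_chars
        self.num_classes = num_classes
        # Feature extractor: low-level to high-level features, ReLU activations
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
        )
        # Single head predicts all character positions jointly (no segmentation step)
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.LazyLinear(512), nn.ReLU(),
            nn.Linear(512, num_chars * num_classes),
        )

    def forward(self, x):
        x = self.features(x)        # whole CAPTCHA image as input
        x = self.classifier(x)
        # Reshape to (batch, positions, classes) so each position gets its own softmax
        return x.view(-1, self.num_chars, self.num_classes)

if __name__ == "__main__":
    model = CaptchaCNN()
    logits = model(torch.randn(8, 1, 40, 100))  # dummy batch of grayscale CAPTCHAs
    print(logits.shape)  # torch.Size([8, 4, 36])
```

Predicting all character positions from one shared feature map is one common way to realize "end-to-end" recognition without segmentation; the paper's exact layer sizes and training setup (learning-rate schedule, number of iterations) are not reproduced here.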