Authors: Zhu En[1], Yin Jianping[1], Hu Chunfeng[1], Zhang Guomin[1]
Affiliation: [1] College of Computer, National University of Defense Technology, Changsha, Hunan 410073, China
Source: Journal of Tongji University (Natural Science), 2007, No. 5, pp. 669-674 (6 pages)
Fund: National Natural Science Foundation of China (60603015, 60373023)
Abstract: A scheme is proposed for systematically estimating fingerprint ridge orientation and segmenting the fingerprint image by evaluating the correctness of the ridge orientation with a neural network. The network is trained to judge the correctness of orientations estimated by a gradient-based method; once trained, it can distinguish correct from incorrect ridge orientations, so the falsely estimated orientation of a local image block can be corrected using surrounding blocks whose orientations were estimated correctly. A coarse segmentation can also be obtained from the trained network by taking blocks with correctly estimated orientations as foreground and blocks with incorrectly estimated orientations as background. In addition, following the orientation-correctness evaluation, a secondary segmentation method is proposed to remove residual ridges, i.e., the afterimage of previously scanned fingers. The proposed scheme was applied to minutiae detection and compared with VeriFinger 4.2, released by Neurotechnologija Ltd. in 2004; the comparison shows that the scheme improves the accuracy of minutiae detection.
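The abstract's starting point, block-wise gradient-based orientation estimation, is a standard technique. The sketch below shows one common form of it (averaged squared gradients with doubled-angle averaging); the block size of 16 and the use of `np.gradient` are illustrative assumptions, and the paper's actual contribution, the neural-network correctness check and correction, is not reproduced here.

```python
import numpy as np

def block_orientations(img, block=16):
    """Estimate the local ridge orientation of each image block using
    averaged squared gradients (a sketch; the paper adds a neural-network
    correctness check on top of such gradient-based estimates)."""
    gy, gx = np.gradient(img.astype(float))  # per-pixel image gradients
    h, w = img.shape
    rows, cols = h // block, w // block
    theta = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            sl = (slice(i * block, (i + 1) * block),
                  slice(j * block, (j + 1) * block))
            # Sum squared-gradient terms over the block; the doubled-angle
            # form avoids the pi-ambiguity of orientations (0 and pi are
            # the same ridge direction).
            gxx = np.sum(gx[sl] ** 2 - gy[sl] ** 2)
            gxy = np.sum(2.0 * gx[sl] * gy[sl])
            # Orientation perpendicular to the dominant gradient direction.
            theta[i, j] = 0.5 * np.arctan2(gxy, gxx) + np.pi / 2
    return theta
```

For a synthetic image of vertical ridges (a sinusoid along x), every block's estimate comes out near pi/2, i.e. ridges running vertically; on real fingerprints, blocks where this estimator fails (noise, smudges, background) are exactly the ones the paper's trained network flags for correction or background labeling.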
Keywords: ridge orientation correctness; fingerprint segmentation; residual ridges; secondary segmentation; segmentation correction
Classification: TP391 [Automation and Computer Technology - Computer Application Technology]