Authors: Raqinah Alrabiah; Muhammad Hussain; Hatim A. AboAlSamh
Source: 《Intelligent Automation & Soft Computing》, 2023, Issue 3, pp. 2941-2962 (22 pages)
Funding: The authors are thankful to the Deanship of Scientific Research, King Saud University, Riyadh, Saudi Arabia, for funding this work through Research Group No. RGP-1439-067.
Abstract: The gender recognition problem has attracted the attention of the computer vision community due to its importance in many applications (e.g., surveillance and human–computer interaction [HCI]). Images of varying levels of illumination, occlusion, and other factors are captured in uncontrolled environments. Iris and facial recognition technology cannot be used on these images because iris texture is unclear in these instances, and faces may be covered by a scarf, hijab, or mask due to the COVID-19 pandemic. The periocular region is a reliable source of information because it features rich discriminative biometric features. However, most existing gender classification approaches have been designed based on hand-engineered features or validated in controlled environments. Motivated by the superior performance of deep learning, we proposed a new method, PeriGender, inspired by the design principles of the ResNet and DenseNet models, that can classify gender using features from the periocular region. The proposed system utilizes a dense concept in a residual model. Through skip connections, it reuses features on different scales to strengthen discriminative features. Evaluations of the proposed system on challenging datasets indicated that it outperformed state-of-the-art methods. It achieved 87.37%, 94.90%, 94.14%, 99.14%, and 95.17% accuracy on the GROUPS, UFPR-Periocular, Ethnic-Ocular, IMP, and UBIPr datasets, respectively, in the open-world (OW) protocol. It further achieved 97.57% and 93.20% accuracy for adult periocular images from the GROUPS dataset in the closed-world (CW) and OW protocols, respectively. The results showed that the middle region between the eyes plays a crucial role in the recognition of masculine features, and feminine features can be identified through the eyebrows, upper eyelids, and corners of the eyes. Furthermore, using a whole region without cropping enhances PeriGender's learning capability, improving its understanding of both eyes' global structure without discontinuity.
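Note: As a rough illustration of the "dense concept in a residual model" mentioned in the abstract, the PyTorch sketch below combines a ResNet-style residual path with DenseNet-style feature concatenation, so later blocks can reuse features from earlier scales via skip connections. The block name (DenseResidualBlock), layer sizes, and the 1x1 projection are illustrative assumptions; this is not the published PeriGender architecture.

    # Hypothetical sketch: residual computation (ResNet-style) whose output is
    # concatenated with the incoming features (DenseNet-style) for feature reuse.
    import torch
    import torch.nn as nn

    class DenseResidualBlock(nn.Module):
        """Residual block whose output is concatenated with its input features."""

        def __init__(self, in_channels: int, growth: int):
            super().__init__()
            self.body = nn.Sequential(
                nn.Conv2d(in_channels, growth, kernel_size=3, padding=1, bias=False),
                nn.BatchNorm2d(growth),
                nn.ReLU(inplace=True),
                nn.Conv2d(growth, growth, kernel_size=3, padding=1, bias=False),
                nn.BatchNorm2d(growth),
            )
            # 1x1 projection so the residual addition has matching channel counts
            self.proj = nn.Conv2d(in_channels, growth, kernel_size=1, bias=False)
            self.relu = nn.ReLU(inplace=True)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            out = self.relu(self.body(x) + self.proj(x))  # residual (ResNet-style) path
            return torch.cat([x, out], dim=1)             # dense (DenseNet-style) feature reuse

    if __name__ == "__main__":
        block = DenseResidualBlock(in_channels=32, growth=16)
        x = torch.randn(1, 32, 64, 64)   # e.g., a periocular feature map
        print(block(x).shape)            # torch.Size([1, 48, 64, 64])

The concatenation in the final line of forward() is what lets subsequent blocks see features from earlier scales, which is the mechanism the abstract credits for strengthening discriminative features.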
Keywords: Gender recognition; periocular region; deep learning; convolutional neural network; unconstrained environment
Classification: TP391.4 [Automation and Computer Technology - Computer Application Technology]