Authors: V. Arulalan, Dhananjay Kumar
Affiliation: [1] Department of Information Technology, Anna University, MIT Campus, Chennai 600044, India
Source: Computer Systems Science & Engineering, 2023, No. 2, pp. 1703-1717 (15 pages)
Abstract: Object detection and classification are trending research topics in computer vision because of applications such as visual surveillance. However, vision-based object detection and classification methods still struggle to detect small and densely packed objects in complex, dynamic environments with high accuracy and precision. This paper proposes a novel enhanced method to detect and classify objects using a Hyperbolic Tangent based You Only Look Once V4 (YOLOv4) network with a Modified Manta-Ray Foraging Optimization based Convolutional Neural Network. In pre-processing, the video data is first converted into image sequences, and a Polynomial Adaptive Edge Preserving Algorithm is applied for image resizing and noise removal. The contrast of the noiseless, resized image sequences is then enhanced using a Contrast Limited Adaptive Edge Preserving Algorithm. The contrast-enhanced image sequences are used to train the Hyperbolic Tangent based YOLOv4 for object detection. Additionally, to detect smaller objects with high accuracy, a grasp configuration is observed for every detected object. Finally, the Modified Manta-Ray Foraging Optimization based Convolutional Neural Network carries out the detection and classification of objects. Comparative experiments on various benchmark datasets and against existing methods showed improved detection and classification accuracy.
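The abstract's final stage relies on Manta-Ray Foraging Optimization (MRFO), a published metaheuristic with three movement phases (chain, cyclone, and somersault foraging). The paper's "Modified" variant is not specified in this record, so the following is only a minimal sketch of standard MRFO in pure Python, minimizing a toy sphere function rather than tuning CNN parameters; all function and variable names here are illustrative, not taken from the paper.

```python
import math
import random

def mrfo(objective, dim, bounds, pop_size=20, iters=200, seed=0):
    """Minimize `objective` over box `bounds` with a basic
    Manta-Ray Foraging Optimization loop (chain, cyclone,
    and somersault foraging phases)."""
    rng = random.Random(seed)
    lo, hi = bounds

    def clip(x):
        # keep candidates inside the search box
        return [min(max(v, lo), hi) for v in x]

    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    fit = [objective(x) for x in pop]
    b = min(range(pop_size), key=lambda i: fit[i])
    best_x, best_f = pop[b][:], fit[b]

    for t in range(1, iters + 1):
        for i in range(pop_size):
            r = rng.random()
            if rng.random() < 0.5:
                # cyclone foraging: spiral around a reference point
                r1 = rng.random()
                beta = (2 * math.exp(r1 * (iters - t + 1) / iters)
                        * math.sin(2 * math.pi * r1))
                if t / iters < rng.random():
                    # early iterations: explore around a random point
                    ref = [rng.uniform(lo, hi) for _ in range(dim)]
                else:
                    # later iterations: exploit around the best solution
                    ref = best_x
                prev = pop[i - 1] if i > 0 else ref
                new = [ref[d] + r * (prev[d] - pop[i][d])
                       + beta * (ref[d] - pop[i][d]) for d in range(dim)]
            else:
                # chain foraging: follow the previous ray and the best
                alpha = 2 * r * math.sqrt(abs(math.log(r))) if r > 0 else 0.0
                prev = pop[i - 1] if i > 0 else best_x
                new = [pop[i][d] + r * (prev[d] - pop[i][d])
                       + alpha * (best_x[d] - pop[i][d]) for d in range(dim)]
            new = clip(new)
            f = objective(new)
            if f < fit[i]:
                pop[i], fit[i] = new, f
                if f < best_f:
                    best_x, best_f = new[:], f
        # somersault foraging: flip around the current best position
        for i in range(pop_size):
            new = clip([pop[i][d] + 2.0 * (rng.random() * best_x[d]
                        - rng.random() * pop[i][d]) for d in range(dim)])
            f = objective(new)
            if f < fit[i]:
                pop[i], fit[i] = new, f
                if f < best_f:
                    best_x, best_f = new[:], f
    return best_x, best_f

# toy usage: minimize the 5-dimensional sphere function
sphere = lambda x: sum(v * v for v in x)
x, f = mrfo(sphere, dim=5, bounds=(-10.0, 10.0))
```

In the paper's setting the objective would presumably be a CNN loss or validation error rather than the sphere function, and the modified variant would alter one or more of the three phases; this sketch only shows the baseline search dynamics.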
Keywords: object detection; hyperbolic tangent; YOLO; manta-ray foraging; object classification
Classification: TP183 [Automation and Computer Technology - Control Theory and Control Engineering]