Authors: LU Haiping; SUN Yongrong; ZHAO Wei; ZHANG Yi (College of Automation Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 211106, China)
Affiliation: [1] College of Automation Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 211106, Jiangsu, China
Source: Modern Electronics Technique (《现代电子技术》), 2022, No. 11, pp. 41-45 (5 pages)
Funding: National Natural Science Foundation of China (61973160)
Abstract: To address small-object detection in different kinds of airport runway environments, a visual saliency detection algorithm based on a multi-scale fusion hypercomplex Fourier transform is proposed. A 2D Log-Gabor filter bank, which imitates the human visual receptive field, extracts colour, brightness and texture-orientation features from images preprocessed with a fast guided filtering algorithm. From these features, hypercomplex images are constructed at each scale; the phase spectrum of their fast Fourier transform (FFT) is computed and filtered, and the visual saliency map is obtained by multi-scale normalization after the inverse transform. Finally, adaptive threshold segmentation extracts the foreground targets, completing the airport runway foreign object debris (FOD) detection task. Experimental results show that, compared with traditional frequency-domain saliency algorithms, the proposed algorithm achieves higher detection accuracy and a lower false alarm rate, and performs well across a variety of airport runway pavement scenes.
Keywords: airport runway foreign object debris (FOD) detection; visual saliency; detection algorithm; image preprocessing; hypercomplex image; target extraction
Classification code: TN911.73-34 [Electronics and Telecommunications / Communication and Information Systems]
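The abstract above describes the full pipeline (guided-filter preprocessing, Log-Gabor feature extraction, multi-scale hypercomplex FFT phase spectrum, multi-scale normalization, adaptive thresholding). The Python sketch below illustrates only the generic phase-spectrum hypercomplex (quaternion) Fourier saliency idea under simplifying assumptions, not the authors' implementation: intensity and colour-opponency channels stand in for the paper's Log-Gabor features, the guided-filter stage is omitted, and the scales and threshold factor k are placeholders. Function names such as hypercomplex_saliency are illustrative.

```python
# Minimal sketch of phase-spectrum hypercomplex (quaternion) Fourier saliency
# with multi-scale fusion. NOT the paper's exact method: Log-Gabor features and
# fast guided filtering are replaced by simple intensity/colour-opponency
# channels, and the adaptive threshold is a plain "k times mean" rule.
import numpy as np
from scipy.ndimage import gaussian_filter, zoom


def _features(img):
    """Intensity and two colour-opponency channels from an RGB float image."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    intensity = (r + g + b) / 3.0
    rg = r - g                      # red-green opponency
    by = b - (r + g) / 2.0          # blue-yellow opponency
    return intensity, rg, by


def _phase_saliency(f1, f2, f3, f4):
    """Phase-only hypercomplex spectrum via two complex planes (symplectic split)."""
    c1 = f1 + 1j * f2               # scalar + i part of the quaternion image
    c2 = f3 + 1j * f4               # j + k part of the quaternion image
    F1, F2 = np.fft.fft2(c1), np.fft.fft2(c2)
    mag = np.sqrt(np.abs(F1) ** 2 + np.abs(F2) ** 2) + 1e-12
    # keep only the phase of the hypercomplex spectrum, then invert
    p1, p2 = np.fft.ifft2(F1 / mag), np.fft.ifft2(F2 / mag)
    sal = np.abs(p1) ** 2 + np.abs(p2) ** 2
    return gaussian_filter(sal, sigma=3)       # smooth the raw saliency map


def hypercomplex_saliency(img, scales=(1.0, 0.5, 0.25), k=3.0):
    """Multi-scale fused saliency map and a binary foreground mask (illustrative)."""
    img = img.astype(np.float64) / 255.0
    h, w = img.shape[:2]
    fused = np.zeros((h, w))
    for s in scales:
        small = zoom(img, (s, s, 1), order=1)  # analyse the image at this scale
        intensity, rg, by = _features(small)
        motion = np.zeros_like(intensity)      # static image: motion channel is zero
        sal = _phase_saliency(motion, intensity, rg, by)
        sal = zoom(sal, (h / sal.shape[0], w / sal.shape[1]), order=1)
        sal = (sal - sal.min()) / (sal.max() - sal.min() + 1e-12)  # per-scale normalisation
        fused += sal
    fused /= len(scales)
    mask = fused > k * fused.mean()            # simple adaptive threshold
    return fused, mask
```

In the paper's method, the quaternion channels would instead carry the colour, brightness and texture-orientation responses of the 2D Log-Gabor bank computed on the guided-filtered image, the phase spectrum would additionally be filtered before the inverse transform, and the per-scale maps would be fused by the multi-scale normalization step described in the abstract.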