Authors: ZHANG Tian, XU Shuwen, BAI Xiaohui, SHUI Penglang (National Key Laboratory of Radar Signal Processing, Xidian University, Xi'an, Shaanxi 710071, China)
Affiliation: National Key Laboratory of Radar Signal Processing, Xidian University, Xi'an, Shaanxi 710071, China
Source: Modern Radar (《现代雷达》), 2024, Issue 7, pp. 16-22 (7 pages)
Funding: National Natural Science Foundation of China (Grant No. 62371382)
Abstract: The detection of sea-surface unmanned aerial vehicles (UAVs) is a small-target detection problem against a sea-clutter background, owing to their weak echoes, and joint multi-feature detection is an effective approach to such problems. To address the excessive computational complexity of the existing time-frequency (TF) tri-feature detection method in the feature-extraction stage, which makes real-time detection difficult, this paper proposes a fast-TF-map-based multi-feature detection method for sea-surface UAVs. First, a segmented fast Fourier transform (FFT) is applied to the complex radar echo data, and the resulting Doppler amplitude spectra are aligned and concatenated along the Doppler dimension to construct a fast TF map. Second, the fast TF map is normalized to suppress clutter and enhance the target echoes, and three TF features are extracted from the normalized map. Third, a fast convex-hull learning algorithm is used to train the detection decision region for a given false-alarm probability. Finally, the effectiveness of the proposed method is validated and analyzed using measured UAV data.
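The fast TF map construction described in the abstract (segmented FFT of the complex echo, with the per-segment Doppler amplitude spectra stacked along the Doppler dimension) can be sketched as below. The segment length, the non-overlapping segmentation, and the per-Doppler-bin mean normalization are illustrative assumptions for this sketch, not details taken from the paper:

```python
import numpy as np

def fast_tf_map(echo, seg_len):
    """Segment a complex radar echo, FFT each segment, and stack the
    Doppler amplitude spectra so rows are Doppler bins and columns are
    time segments (a "fast time-frequency map")."""
    n_segs = len(echo) // seg_len  # non-overlapping segments (assumption)
    segs = echo[:n_segs * seg_len].reshape(n_segs, seg_len)
    # fftshift centres zero Doppler; take the amplitude spectrum per segment
    spectra = np.abs(np.fft.fftshift(np.fft.fft(segs, axis=1), axes=1))
    return spectra.T  # shape: (seg_len Doppler bins, n_segs time segments)

def normalize_tf(tf_map):
    """Illustrative normalization: scale each Doppler bin by its
    time-averaged level, flattening stationary clutter ridges so a
    localized target return stands out. The paper's exact normalization
    may differ."""
    return tf_map / (tf_map.mean(axis=1, keepdims=True) + 1e-12)

# A pure tone at Doppler bin 10 of a 64-point FFT lands, after
# fftshift, at row 10 + 32 = 42 in every column of the map.
n = np.arange(1024)
echo = np.exp(2j * np.pi * 10 * n / 64)
tf = fast_tf_map(echo, 64)
print(tf.shape, int(np.argmax(tf[:, 0])))  # (64, 16) 42
```

Compared with a sliding-window short-time Fourier transform, which computes one FFT per pulse position, the segmented scheme computes only one FFT per non-overlapping block, which is consistent with the abstract's claim of reduced feature-extraction cost; the features extracted from the normalized map would then be tested against the learned convex-hull decision region.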
CLC codes: TN957.5 [Electronics and Telecommunications / Signal and Information Processing]; V279 [Electronics and Telecommunications / Information and Communication Engineering]