Affiliations: [1] Army Officer Academy, Hefei, Anhui 230031, China; [2] Anhui Water Conservancy Technical College, Hefei, Anhui 231603, China
Source: Laser & Infrared, 2016, No. 9, pp. 1133-1138 (6 pages)
Abstract: Laser jamming of a TV guidance system can degrade or disable the target detection, recognition, and tracking capability of its photoelectric detectors, so an objective and accurate evaluation of the degree of laser interference with a CCD imaging system supports both the development of photoelectric countermeasure equipment and the laser protection of detection equipment. Starting from the degree to which target features in the image are invalidated after laser interference, and exploiting the property of compressed sensing (CS) theory that the feature information of a sparse signal can be obtained directly through a measurement matrix, a CS-based method for evaluating laser interference effects is proposed. The nonsubsampled contourlet transform (NSCT) is used as the sparsifying transform, and the interference effect is evaluated from the change in the image's CS features before and after interference. Experimental results show that, compared with the classical image quality assessment metrics MSE (mean square error) and SSIM (structural similarity), the proposed CS-based method gives reasonable evaluation results under different laser powers and different detection angles; it overcomes the influence of detection angle on the evaluation results and is more sensitive to different jamming powers.
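The core idea of the abstract (project the image through a CS measurement matrix and score the change in the measurement vector before and after jamming) can be sketched as follows. This is an illustrative sketch, not the paper's implementation: it uses a random Gaussian measurement matrix, omits the NSCT sparsifying step (the raw image stands in for its sparse coefficients), and the jamming model, image size, and measurement count are arbitrary assumptions.

```python
import numpy as np

def cs_features(img, phi):
    """Project the vectorized image onto the CS measurement matrix."""
    return phi @ img.ravel()

def interference_score(ref, jammed, phi):
    """Relative change of CS features before vs. after jamming (0 = no change)."""
    y_ref = cs_features(ref, phi)
    y_jam = cs_features(jammed, phi)
    return np.linalg.norm(y_ref - y_jam) / np.linalg.norm(y_ref)

rng = np.random.default_rng(0)
h = w = 32          # toy image size (assumption)
m = 256             # number of CS measurements, m << h*w (assumption)
# Gaussian measurement matrix, columns normalized by sqrt(m)
phi = rng.standard_normal((m, h * w)) / np.sqrt(m)

ref = rng.random((h, w))  # stand-in for the undisturbed scene
# Crude jamming model: a bright laser spot of increasing power
for power in (0.5, 1.0, 2.0):
    jammed = ref.copy()
    jammed[12:20, 12:20] += power
    print(f"power={power}: score={interference_score(ref, jammed, phi):.3f}")
```

The score grows monotonically with jamming power (the measurement operator is linear, so the feature change scales with the perturbation), which is the sensitivity property the abstract claims over MSE and SSIM; a real implementation would apply the measurement matrix to NSCT coefficients instead of raw pixels.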
Classification code: TN977 [Electronics & Telecommunications — Signal and Information Processing]