Authors: Xing Yuhua[1]; Yan Zhiheng (School of Automation and Information Engineering, Xi'an University of Technology, Xi'an 710048, Shaanxi, China)
Affiliation: [1] School of Automation and Information Engineering, Xi'an University of Technology, Xi'an 710048, Shaanxi, China
Source: 《激光与光电子学进展》 (Laser & Optoelectronics Progress), 2022, Issue 19, pp. 161-167 (7 pages)
Abstract: In chaotic spread-spectrum time delay estimation, existing methods yield a low peak-to-sidelobe ratio, produce many false-detection points, and struggle to estimate the delay of cable-fault signals at low signal-to-noise ratio (SNR). This study therefore proposes a new delay estimation method that combines a one-dimensional slice of the third-order cumulant with quadratic correlation. The method was applied to a Simulink simulation model of chaotic time delay estimation. Simulation results show that, compared with the existing basic cross-correlation and quadratic-correlation methods, the proposed method not only achieves good estimation performance at lower SNR but also suppresses interference from correlated background noise and non-Gaussian noise. Relative to basic cross-correlation, the absolute value of the main peak-to-sidelobe ratio increases by more than 1.70 dB, and the ratio of the false-detection peak to the fault-point peak decreases by more than 0.133. This method provides a new technical route for detecting faults in optical cables and electrical cables via chaotic spread-spectrum detection.
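The two building blocks named in the abstract can be sketched in NumPy. This is a minimal illustration of the general techniques, not the authors' Simulink model: the function names are invented here, and the diagonal slice c3(τ) = E[x(n)·x(n)·x(n+τ)] is one standard choice of one-dimensional third-order-cumulant slice, assumed rather than taken from the paper.

```python
import numpy as np

def third_order_slice(x, max_lag):
    """1-D diagonal slice of the third-order auto-cumulant:
    c3(tau) = E[x(n) * x(n) * x(n + tau)].
    Gaussian noise has identically zero third-order cumulants, which is
    why cumulant-domain processing suppresses Gaussian interference.
    """
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    return np.array([np.mean(x[: n - tau] ** 2 * x[tau:])
                     for tau in range(max_lag + 1)])

def estimate_delay_quadratic(x, y):
    """Delay of y relative to x via quadratic (second) correlation:
    correlate the cross-correlation R_xy with the autocorrelation R_xx.
    The second correlation sharpens the main peak relative to sidelobes.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    r_xy = np.correlate(y, x, mode="full")   # peak near index len(x)-1 + D
    r_xx = np.correlate(x, x, mode="full")   # peak at index len(x)-1 (lag 0)
    r2 = np.correlate(r_xy, r_xx, mode="full")
    # r_xy is (approximately) r_xx shifted by the true delay D, so the
    # second correlation peaks at index (len(r_xx) - 1) + D.
    return int(np.argmax(np.abs(r2)) - (len(r_xx) - 1))
```

In the paper's pipeline the cumulant slice is computed first and the quadratic correlation is then applied to the slices, so that Gaussian noise is suppressed before the peak search; the sketch above keeps the two stages separate for clarity.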
Keywords: Fourier optics and signal processing; time delay estimation; chaotic spread spectrum; third-order cumulant; cross-correlation
Classification Code: TN914.42 [Electronics & Telecommunications—Communication and Information Systems]