Author: Yang Juping
Affiliation: [1] Shenhua Railway Freight Wagon Transport Co., Ltd., Beijing 100011, China
Source: China Railway Science, 2016, No. 5, pp. 102-107 (6 pages)
Funding: Science and Technology Innovation Project of China Shenhua Energy Company Limited (SHGF-12-55)
Abstract: In the online ultrasonic inspection of railway freight car wheels, independent peak signals of a certain width are selected from the detection signals as the objects of recognition. The recognition feature vector, composed of the maximum correlation coefficient between two adjacent recognition objects and the signal-to-background ratio, serves as the perceptron input; the classification of detection signals into defect and non-defect signals serves as the perceptron output. With the primary goal of eliminating missed detections and the secondary goal of minimizing false detections, training samples are built from the detection signals of artificial defect test blocks. Based on the perceptron principle of neural networks and in accordance with field inspection requirements, the traditional pocket defect recognition algorithm is improved: a missed-detection counter and a false-detection counter are set separately, and the weights of the recognition feature parameters are adjusted through sample training, thereby achieving accurate recognition of defect signals. Verification with examples shows that the improved defect recognition algorithm ensures no missed detections while keeping false detections to a minimum.
Keywords: wheel flaw detection; defect recognition; ultrasonic testing; perceptron algorithm; railway freight car
Classification: U270.331.1 [Mechanical Engineering - Vehicle Engineering]; U279.32 [Transportation Engineering - Vehicle Operation Engineering]
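The improvement described in the abstract, a pocket-style perceptron that keeps separate missed-detection and false-detection counters and prefers weights with fewer missed detections first, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the feature vectors, training data, epoch count, and tie-breaking rule below are all assumptions for demonstration.

```python
# Sketch of a pocket perceptron with asymmetric error counters: missed
# detections (defects classified as non-defects) are minimized first,
# false detections second. All data below is illustrative only.
import numpy as np

def train_pocket_perceptron(X, y, epochs=200, seed=0):
    """X: (n, 2) feature vectors (max correlation coefficient,
    signal-to-background ratio); y: +1 for defect, -1 for non-defect."""
    rng = np.random.default_rng(seed)
    Xb = np.hstack([X, np.ones((len(X), 1))])  # append bias term
    w = np.zeros(Xb.shape[1])
    best_w = w.copy()
    best_missed, best_false = np.inf, np.inf
    for _ in range(epochs):
        for i in rng.permutation(len(Xb)):
            if y[i] * (Xb[i] @ w) <= 0:        # misclassified: perceptron update
                w = w + y[i] * Xb[i]
        pred = np.sign(Xb @ w)
        missed = int(np.sum((y == 1) & (pred != 1)))   # missed-detection counter
        false_ = int(np.sum((y == -1) & (pred == 1)))  # false-detection counter
        # Keep the best weights "in the pocket": minimize missed detections
        # first, then break ties on false detections.
        if (missed, false_) < (best_missed, best_false):
            best_missed, best_false = missed, false_
            best_w = w.copy()
    return best_w, best_missed, best_false

# Illustrative, linearly separable toy data: defect signals show a high
# correlation coefficient and a high signal-to-background ratio.
X = np.array([[0.9, 8.0], [0.85, 7.0], [0.8, 6.5],   # defect signals
              [0.3, 2.0], [0.2, 1.5], [0.4, 2.5]])   # non-defect signals
y = np.array([1, 1, 1, -1, -1, -1])
w, missed, false_ = train_pocket_perceptron(X, y)
```

On separable data like this toy set the pocket weights reach zero missed and zero false detections; on noisy field data the lexicographic (missed, false) ordering is what encodes the paper's priority of eliminating missed detections before minimizing false ones.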