Authors: Zhao Zhenbing [1], Guang Zejing [2], Gao Qiang [1], Wang Kunqian [3]
Affiliations: [1] School of Electrical and Electronic Engineering, North China Electric Power University, Baoding 071003, China; [2] State Grid Information & Telecommunication Co., Ltd., Beijing 100761, China; [3] North China Grid Company Limited, Beijing 100053, China
Source: High Voltage Engineering (《高电压技术》), 2013, No. 11, pp. 2642-2649 (8 pages)
Funding: Fundamental Research Funds for the Central Universities (12MS122)
Abstract: Fusing the infrared and visible images of substation equipment can greatly improve the accuracy of locating thermal faults. Because the wavelet transform lacks directional information and cannot provide an ideal sparse representation of two-dimensional images, and because edge features are essential in image fusion, an infrared and visible image fusion algorithm based on a hidden Markov tree (HMT) model in the contourlet transform (CT) domain is proposed. The proposed algorithm was applied to infrared and visible images of equipment in a 500 kV substation. The results show that, because the high-frequency coefficients obtained from the CT decomposition are modeled with an HMT trained by the expectation maximization (EM) algorithm, and a fusion rule based on edge detection with the Canny operator is adopted, the algorithm preserves more detail while yielding a smoother, finer fused image; objective statistics of the fusion results, including the mean, standard deviation, average gradient, and information entropy, are all significantly improved. The algorithm thus provides a quantitative basis and guidance for the intelligent detection and recognition of substation equipment.
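The abstract evaluates fusion quality with four objective statistics: mean, standard deviation, average gradient, and information entropy. The sketch below is a minimal illustration, not code from the paper, of how such statistics are commonly computed for a fused grayscale image; the function name and the particular average-gradient and histogram-entropy formulations are textbook definitions assumed here.

```python
import numpy as np

def fusion_metrics(img):
    """Common objective statistics for a fused grayscale image:
    mean, standard deviation, average gradient, information entropy."""
    img = np.asarray(img, dtype=np.float64)

    mean = img.mean()   # overall brightness
    std = img.std()     # contrast around the mean

    # Average gradient: mean magnitude of local intensity changes,
    # a widely used sharpness / detail measure for fused images.
    gx = np.diff(img, axis=1)[:-1, :]   # horizontal differences
    gy = np.diff(img, axis=0)[:, :-1]   # vertical differences
    avg_grad = np.mean(np.sqrt((gx ** 2 + gy ** 2) / 2.0))

    # Information entropy (bits/pixel) of the grey-level histogram.
    hist, _ = np.histogram(img, bins=256, range=(img.min(), img.max() + 1e-9))
    p = hist / hist.sum()
    p = p[p > 0]
    entropy = -np.sum(p * np.log2(p))

    return {"mean": mean, "std": std, "avg_gradient": avg_grad, "entropy": entropy}

if __name__ == "__main__":
    # Toy check on random data; with real data, pass the fused image array.
    demo = np.random.randint(0, 256, size=(128, 128)).astype(np.float64)
    print(fusion_metrics(demo))
```

Higher average gradient and entropy generally indicate that more edge detail and information content survived the fusion, which is the improvement the abstract reports for the proposed CT-HMT algorithm.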