Authors: Zhai Haibao; Wang Xingzhi; Ge Minhui; Yang Zhenglin [2]; Feng Shuhai [2]; Liu Yuhang (East China Division of State Grid Corporation of China, Shanghai 200120, China; China Electric Power Research Institute (Nanjing), Nanjing, Jiangsu 210003, China; College of Electrical Engineering, Shanghai University of Electric Power, Shanghai 200090, China)
Affiliations: [1] East China Division of State Grid Corporation of China, Shanghai 200120, China; [2] China Electric Power Research Institute (Nanjing), Nanjing, Jiangsu 210003, China; [3] College of Electrical Engineering, Shanghai University of Electric Power, Shanghai 200090, China
Source: Electrical Automation (《电气自动化》), 2021, No. 2, pp. 105-108 (4 pages)
Funding: Science and Technology Project of State Grid Corporation of China (520800180004).
Abstract: Aiming at the problems of frequent faults on grid transmission lines, the high false alarm rate of the warning system, and its reliance on after-the-fact analysis by operation and maintenance personnel, a fault diagnosis model for grid transmission lines based on an improved convolutional neural network (CNN) was proposed. First, the current time-series data of the transmission lines were preprocessed. The CNN was then improved through dual-channel fusion and multi-layer convolution and pooling, with batch normalization applied in the convolutional layers, so that features were extracted from fault data and normal planned-outage data separately. Classification and identification were then performed by a soft-max classifier, yielding an intelligent and efficient fault diagnosis model that effectively reduces the false alarm rate. Finally, actual data from the State Grid dispatching center verified the effectiveness of the proposed method.
Keywords: transmission line; false alarm rate; convolutional neural network; dual-channel fusion; fault diagnosis
Classification code: TM734 [Electrical Engineering - Power Systems and Automation]
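To make the architecture described in the abstract concrete, below is a minimal PyTorch sketch of a dual-channel CNN for current time-series classification: two parallel branches of stacked convolution, batch normalization and pooling whose features are fused before a soft-max classifier. The layer counts, kernel sizes, window length, and class count are assumptions for illustration only; the paper's actual hyperparameters are not given in this record, and this is not the authors' code.

# Hypothetical sketch of a dual-channel fusion CNN (assumed dimensions).
import torch
import torch.nn as nn

class DualChannelCNN(nn.Module):
    """Two parallel 1-D convolutional branches over current time-series
    windows, fused before a soft-max classifier (illustrative sketch)."""
    def __init__(self, seq_len=256, num_classes=2):
        super().__init__()
        def branch():
            # Multi-layer convolution and pooling with batch normalization,
            # as the abstract describes for each channel.
            return nn.Sequential(
                nn.Conv1d(1, 16, kernel_size=5, padding=2),
                nn.BatchNorm1d(16),
                nn.ReLU(),
                nn.MaxPool1d(2),
                nn.Conv1d(16, 32, kernel_size=3, padding=1),
                nn.BatchNorm1d(32),
                nn.ReLU(),
                nn.MaxPool1d(2),
            )
        self.branch_a = branch()  # e.g. one current-data channel (assumed)
        self.branch_b = branch()  # e.g. a second, derived channel (assumed)
        fused_len = 32 * (seq_len // 4) * 2  # channels x pooled length x 2 branches
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(fused_len, num_classes),
        )

    def forward(self, x_a, x_b):
        # x_a, x_b: (batch, 1, seq_len) preprocessed current windows
        fused = torch.cat([self.branch_a(x_a), self.branch_b(x_b)], dim=1)
        return self.classifier(fused)  # logits; soft-max applied at inference

# Usage example with random data in place of real grid measurements.
model = DualChannelCNN()
xa = torch.randn(8, 1, 256)
xb = torch.randn(8, 1, 256)
probs = torch.softmax(model(xa, xb), dim=1)  # fault vs. normal-outage probabilities

In training, the soft-max would normally be folded into a cross-entropy loss on the logits; fusing the two branches by channel concatenation is one common reading of "dual-channel fusion", chosen here purely for illustration.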