Bus Load Prediction Method Based on SSA-Bi-LSTM Neural Network

Cited by: 9


Authors: HU Rule; CHEN Yicong; ZHANG Dahai [2]; ZHANG Pei; WANG Shuyang; YU Yun (CSG Research Institute Co., Ltd., Guangzhou, Guangdong 510663, China; School of Electrical Engineering, Beijing Jiaotong University, Beijing 100044, China; Tianjin Hongyuan Smart Energy Co., Ltd., Tianjin 300010, China)

Affiliations: [1] Digital Grid Research Institute Co., Ltd. of China Southern Power Grid, Guangzhou, Guangdong 510663, China; [2] School of Electrical Engineering, Beijing Jiaotong University, Beijing 100044, China; [3] Tianjin Hongyuan Smart Energy Co., Ltd., Tianjin 300010, China

Source: Guangdong Electric Power, 2022, No. 2, pp. 19-26 (8 pages)

Funding: Science and Technology Project of China Southern Power Grid Co., Ltd. (670000KK52200152); National Key R&D Program of China (2016YFB0900600).

Abstract: To improve bus load prediction accuracy and address two weaknesses of the long short-term memory (LSTM) neural network in this task, namely reduced accuracy due to insufficient extraction of load patterns and hyperparameter settings that depend on experience, this paper first constructs a bidirectional LSTM (Bi-LSTM) network, a variant of LSTM that also captures information available from the future portion of the time series. The sparrow search algorithm (SSA) is then used to search for the optimal hyperparameters, yielding the optimal learning rate, number of hidden-layer neurons, and number of iterations. The SSA-Bi-LSTM model is validated on actual 10 kV bus data and compared with Bi-LSTM and BP neural networks; the results show that the SSA-Bi-LSTM model achieves better prediction performance.
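The following is a minimal, hypothetical Python sketch of the workflow the abstract describes: a Bi-LSTM regressor over sliding windows of a load series, with a heavily simplified sparrow-search-style loop tuning the learning rate, hidden-unit count, and training epochs. It assumes TensorFlow/Keras and synthetic data; all function names, parameter ranges, and the reduced producer/follower update are illustrative assumptions rather than the authors' implementation.

```python
# Hypothetical sketch: Bi-LSTM load forecaster tuned by a simplified
# sparrow-search-style loop. Assumes TensorFlow/Keras; data is synthetic.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models, optimizers

def make_windows(series, lookback=24):
    """Slice a 1-D load series into (lookback window -> next value) samples."""
    X, y = [], []
    for i in range(len(series) - lookback):
        X.append(series[i:i + lookback])
        y.append(series[i + lookback])
    return np.array(X)[..., None], np.array(y)   # (samples, lookback, 1), (samples,)

def build_bilstm(units, lr, lookback=24):
    """Bi-LSTM regressor: the bidirectional layer reads each window forward
    and backward before a single dense output neuron."""
    model = models.Sequential([
        layers.Input(shape=(lookback, 1)),
        layers.Bidirectional(layers.LSTM(units)),
        layers.Dense(1),
    ])
    model.compile(optimizer=optimizers.Adam(learning_rate=lr), loss="mse")
    return model

def evaluate(params, X_tr, y_tr, X_va, y_va):
    """Train briefly with candidate hyperparameters and return validation MSE."""
    units, lr, epochs = int(params[0]), float(params[1]), int(params[2])
    model = build_bilstm(units, lr)
    model.fit(X_tr, y_tr, epochs=epochs, batch_size=32, verbose=0)
    return model.evaluate(X_va, y_va, verbose=0)

def sparrow_search(fitness, bounds, pop=6, iters=5, producer_ratio=0.3, rng=None):
    """Very reduced SSA: 'producers' explore around the current best and the
    rest follow it; the full algorithm also models scroungers and scouts."""
    rng = rng or np.random.default_rng(0)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    X = rng.uniform(lo, hi, size=(pop, len(bounds)))
    fit = np.array([fitness(x) for x in X])
    for _ in range(iters):
        order = np.argsort(fit)
        X, fit = X[order], fit[order]
        best = X[0].copy()
        n_prod = max(1, int(producer_ratio * pop))
        for i in range(pop):
            if i < n_prod:   # producers: local exploration around the best
                X[i] = np.clip(best + rng.normal(scale=0.1 * (hi - lo)), lo, hi)
            else:            # followers: move toward the best position
                X[i] = np.clip(X[i] + rng.uniform() * (best - X[i]), lo, hi)
            fit[i] = fitness(X[i])
    i_best = int(np.argmin(fit))
    return X[i_best], fit[i_best]

if __name__ == "__main__":
    # A synthetic daily-cycle curve stands in for the real 10 kV bus load data.
    t = np.arange(2000)
    load = np.sin(2 * np.pi * t / 96) + 0.1 * np.random.default_rng(1).standard_normal(len(t))
    X, y = make_windows(load)
    split = int(0.8 * len(X))
    X_tr, y_tr, X_va, y_va = X[:split], y[:split], X[split:], y[split:]

    bounds = [(16, 128),     # hidden-layer neurons
              (1e-4, 1e-2),  # learning rate
              (3, 15)]       # training epochs (iterations)
    best, best_mse = sparrow_search(
        lambda p: evaluate(p, X_tr, y_tr, X_va, y_va), bounds)
    print("best (units, lr, epochs):", best, "validation MSE:", best_mse)
```

In the paper's setup the same idea would be driven by the full SSA (including scrounger and alarm behaviour) and by measured 10 kV bus load series rather than the synthetic curve used above.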

Keywords: bus load; bidirectional long short-term memory neural network; load forecasting; sparrow search algorithm; long short-term memory neural network

Classification: TM715.1 [Electrical Engineering - Power Systems and Automation]

 
