Research on Electricity Consumption Prediction Based on Transformer and an Improved Memory Mechanism


Authors: CAI Yue (蔡岳)[1], ZHANG Jin-ming (张津铭), GUO Jing (郭晶), XU Yu-hua (徐玉华)[1], SUN Zhi-xin (孙知信)[1]

Affiliations: [1] School of Modern Posts, Nanjing University of Posts and Telecommunications; Post Big Data Technology and Application Engineering Research Center of Jiangsu Province; Post Industry Technology Research and Development Center of the State Posts Bureau (Internet of Things Technology); Key Laboratory of Broadband Wireless Communication and Sensor Network Technology, Ministry of Education, Nanjing 210000, China; [2] State Grid Information and Communication Industry Group, Beijing 102211, China; [3] Aostar Information Technologies Co., Ltd., Chengdu 610041, China

Source: Information Technology (《信息技术》), 2024, No. 6, pp. 67-74, 79 (9 pages)

Funding: National Natural Science Foundation of China (61972208)

Abstract: In recent years, the rapid development of China's economy has placed higher demands on power allocation, and efficient allocation of power resources requires more accurate electricity consumption forecasting. With the development of artificial intelligence, machine learning, and related technologies, efficient and accurate electricity consumption forecasting has become possible. Long Short-Term Memory (LSTM) networks and their variants are currently the dominant models in this field, but their accuracy is relatively low. This paper proposes an electricity consumption prediction model based on an improved memory mechanism and a Transformer: the Transformer encodes the input sequence, and a novel memory mechanism performs the prediction. Experiments show that, compared with random forest regression and with LSTM and its variants, the average error over one week decreases by 9.05% and 5.32% respectively; the model also converges faster and generalizes better.
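A minimal sketch of the approach the abstract describes (not the authors' published code): a Transformer encoder turns the past consumption sequence into hidden states, and a simple memory module is read through attention to produce the forecast. The module name, the learnable memory slots, the read operation, and all sizes below are assumptions for illustration, implemented in PyTorch.

import torch
import torch.nn as nn

class MemoryAugmentedForecaster(nn.Module):
    """Hypothetical illustration: Transformer encoder + attention-read memory."""
    def __init__(self, d_model=64, n_heads=4, n_layers=2, mem_slots=32, horizon=1):
        super().__init__()
        self.input_proj = nn.Linear(1, d_model)              # scalar load value -> d_model
        enc_layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                               batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=n_layers)
        self.memory = nn.Parameter(torch.randn(mem_slots, d_model))   # learnable memory slots
        self.read_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.head = nn.Linear(2 * d_model, horizon)          # fuse sequence summary + memory read

    def forward(self, x):
        # x: (batch, seq_len) of past consumption values
        h = self.encoder(self.input_proj(x.unsqueeze(-1)))   # (B, T, d_model)
        summary = h.mean(dim=1, keepdim=True)                # (B, 1, d_model) sequence summary
        mem = self.memory.unsqueeze(0).expand(x.size(0), -1, -1)
        read, _ = self.read_attn(summary, mem, mem)          # attention read from memory
        fused = torch.cat([summary, read], dim=-1).squeeze(1)
        return self.head(fused)                              # (B, horizon) forecast

if __name__ == "__main__":
    model = MemoryAugmentedForecaster()
    week = torch.randn(8, 168)          # e.g. one week of hourly readings per sample
    print(model(week).shape)            # torch.Size([8, 1])

Trained with an MSE loss against the next reading, a model of this shape could then be compared against random forest regression and LSTM baselines, which is the comparison the abstract reports.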

Keywords: memory network; Transformer; time-series forecasting; machine learning; long short-term memory

CLC Number: TP391.9 [Automation and Computer Technology - Computer Application Technology]

 
