Authors: YAN Xiuying [1]; MEN Qi; WU Xiaoxue
Affiliation: [1] School of Building Services Science and Engineering, Xi'an University of Architecture and Technology, Xi'an 710055, China
Source: Proceedings of the CSU-EPSA, 2025, No. 4, pp. 88-97 (10 pages)
Funding: Natural Science Foundation of Shaanxi Province (Z20220068)
Abstract: To address the limited accuracy of deep learning prediction models when data are insufficient, a deep transfer learning method combining cross-attention in Transformer (CATrans) and domain separation networks (DSN), termed CATrans-DSN, is proposed for short-term cross-building load forecasting. The CATrans feature extractor uses the attention mechanism to learn domain-common and domain-private temporal features from the load data of the source and target domains, and uses the common features for knowledge transfer. The feature reconstructor serves as an auxiliary module that reconstructs the source-domain and target-domain data, and a regression predictor translates the learned features into forecast values. Finally, the building load forecasting model trained on the source and target domains is applied directly to load prediction for the target building. Experimental results show that the proposed method effectively improves prediction accuracy and model generalization capability under data scarcity.
Classification: TP181 [Automation and Computer Technology / Control Theory and Control Engineering]
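Based only on the architecture described in the abstract above, the sketch below illustrates how a CATrans-DSN-style model could be organized in PyTorch: a shared cross-attention extractor for domain-common features, per-domain private encoders (the DSN-style separation), an auxiliary reconstructor, and a regression head on the common features. All module choices, layer sizes, and names (CrossAttentionExtractor, CATransDSN, d_model, horizon) are assumptions for illustration, not the authors' implementation.

```python
# Hedged sketch of a CATrans-DSN-style model; layer types and sizes are
# assumptions inferred from the abstract, not the paper's actual code.
import torch
import torch.nn as nn

class CrossAttentionExtractor(nn.Module):
    """Domain-common feature extractor: attends from one domain's load
    sequence to the other's, so only shared temporal structure survives."""
    def __init__(self, d_model=64, n_heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, query_seq, context_seq):
        out, _ = self.attn(query_seq, context_seq, context_seq)
        return self.norm(query_seq + out)  # residual + norm

class CATransDSN(nn.Module):
    def __init__(self, n_features=1, d_model=64, horizon=24):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)
        # Shared extractor: knowledge transfer flows through its features.
        self.common = CrossAttentionExtractor(d_model)
        # Domain-private encoders, one per domain (DSN-style separation).
        self.private_src = nn.GRU(d_model, d_model, batch_first=True)
        self.private_tgt = nn.GRU(d_model, d_model, batch_first=True)
        # Auxiliary reconstructor: rebuild the input from common + private.
        self.decoder = nn.Linear(2 * d_model, n_features)
        # Regression predictor maps common features to the load forecast.
        self.regressor = nn.Linear(d_model, horizon)

    def forward(self, x_src, x_tgt):
        h_src, h_tgt = self.embed(x_src), self.embed(x_tgt)
        # Cross-attention in both directions yields domain-common features.
        c_src = self.common(h_src, h_tgt)
        c_tgt = self.common(h_tgt, h_src)
        p_src, _ = self.private_src(h_src)
        p_tgt, _ = self.private_tgt(h_tgt)
        # Auxiliary reconstruction from concatenated common + private parts.
        rec_src = self.decoder(torch.cat([c_src, p_src], dim=-1))
        rec_tgt = self.decoder(torch.cat([c_tgt, p_tgt], dim=-1))
        # Forecast the next `horizon` steps from the last common state.
        y_src = self.regressor(c_src[:, -1])
        y_tgt = self.regressor(c_tgt[:, -1])
        return (y_src, y_tgt), (rec_src, rec_tgt)

# Usage: a batch of week-long (168-step) hourly load histories per domain,
# forecasting the next 24 hours; shapes and horizon are illustrative.
model = CATransDSN(n_features=1, d_model=64, horizon=24)
x_src, x_tgt = torch.randn(8, 168, 1), torch.randn(8, 168, 1)
(y_src, y_tgt), (rec_src, rec_tgt) = model(x_src, x_tgt)
print(y_src.shape, rec_src.shape)  # torch.Size([8, 24]) torch.Size([8, 168, 1])
```

In a training loop, one would presumably combine a prediction loss on both domains' forecasts with a reconstruction loss from the auxiliary decoder, so the private encoders absorb building-specific patterns while the shared extractor carries the transferable ones; the exact losses and weights are not specified in the abstract.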