Authors: LIU Hai-Zhou; GAO Jun-Tao[1] (School of Computer & Information Technology, Northeast Petroleum University, Daqing 163318, China)
Affiliation: [1] School of Computer & Information Technology, Northeast Petroleum University, Daqing 163318, China
Source: Computer Systems & Applications, 2024, No. 12, pp. 231-239 (9 pages)
Fund: Basic Scientific Research Fund of Heilongjiang Provincial Undergraduate Universities (2022TSTD-03).
Abstract: Remaining time prediction helps enterprises improve the quality and efficiency of business process execution. Although existing deep learning methods have shown improvement in remaining time prediction, they still face challenges when dealing with complex business processes. These challenges include insufficient utilization of time features and limited ability to extract local features, leaving room for improvement in prediction accuracy. This study proposes a remaining time prediction method based on an improved Transformer encoder model. Existing methods ignore event time features and struggle to capture local dependencies. To address these limitations, this study introduces a time feature encoding module and a local dependency enhancement module into the model. The time encoding module constructs a semantically rich and discriminative event time representation through embedding learning and multi-granularity concatenation. The local dependency enhancement module uses convolutional neural networks to extract fine-grained local features from the trace prefix after processing with the Transformer encoder. Experiments show that integrating time features and local dependency enhancement improves the prediction accuracy of the remaining time for complex business processes.
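The two ideas in the abstract can be illustrated with a minimal sketch (a hypothetical illustration, not the authors' code; function names and the choice of granularities are assumptions): multi-granularity time encoding derives several temporal views of each event timestamp before embedding and concatenation, and local dependency enhancement applies a sliding convolution window over a sequence to capture local patterns.

```python
from datetime import datetime

def multi_granularity_time_features(timestamps):
    """For each event timestamp in a trace, derive several time
    granularities that a model could embed and concatenate:
    hour of day, day of week, seconds since the previous event,
    and seconds elapsed since the first event of the case.
    (Illustrative granularities only; the paper's exact set may differ.)"""
    first = timestamps[0]
    prev = first
    features = []
    for ts in timestamps:
        features.append({
            "hour": ts.hour,                            # intra-day granularity
            "weekday": ts.weekday(),                    # weekly granularity
            "delta_prev": (ts - prev).total_seconds(),  # inter-event gap
            "elapsed": (ts - first).total_seconds(),    # case-level elapsed time
        })
        prev = ts
    return features

def conv1d(seq, kernel):
    """Naive 1D convolution (valid padding) over a scalar sequence,
    showing how a convolutional layer aggregates only a local window
    of neighboring positions, unlike global self-attention."""
    k = len(kernel)
    return [sum(seq[i + j] * kernel[j] for j in range(k))
            for i in range(len(seq) - k + 1)]

# Example trace: three events of one process case.
trace = [
    datetime(2024, 3, 4, 9, 0, 0),
    datetime(2024, 3, 4, 10, 30, 0),
    datetime(2024, 3, 5, 9, 0, 0),
]
feats = multi_granularity_time_features(trace)
local = conv1d([1, 2, 3, 4], [1, 0, -1])
```

In a full model, each granularity would be mapped through a learned embedding and the results concatenated per event, while the convolution would run over the Transformer encoder's output vectors rather than raw scalars.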
Keywords: remaining time prediction; process mining; deep learning; Transformer