LogDA: Dual Attention-Based Log Anomaly Detection Addressing Data Imbalance


Authors: Chexiaole Zhang, Haiyan Fu

Affiliation: [1] School of Information Science and Technology, Hainan Normal University, Haikou 571158, China

Source: Computers, Materials & Continua, 2025, No. 4, pp. 1291-1306 (16 pages)

Funding: Supported by the Hainan Provincial Natural Science Foundation (Grant No. 622RC675) and the National Natural Science Foundation of China (Grant No. 62262019).

Abstract: As computer-generated data grows exponentially, detecting anomalies in system logs has become increasingly important. Current research on log anomaly detection largely depends on log templates derived from log parsing, with word embeddings used to extract information from those templates. However, this approach neglects part of the log content and faces data imbalance among the different log template types produced by parsing, and dedicated research on imbalance across template categories remains scarce. To address these issues, this work proposes LogDA, a dual-attention-based log anomaly detection model that accounts for data imbalance. LogDA first uses a pre-trained model to extract semantic embeddings from log templates, then computes similarities between embeddings to capture the relationships among templates. A Transformer with a dual-attention mechanism is then constructed to capture positional information and global dependencies. Compared with multiple baselines on three public datasets, the proposed approach improves precision, recall, and F1 scores.
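As a rough illustration of the pipeline outlined in the abstract, the sketch below shows how a pre-trained embedding step, pairwise template similarity, and a dual-attention Transformer block could fit together in PyTorch. All module names, dimensions, and the particular way the two attention branches are fused are assumptions made for this sketch; the paper's exact architecture and hyperparameters may differ.

```python
# Minimal, illustrative sketch of the LogDA-style pipeline described in the abstract.
# The class names, dimensions, and branch-fusion scheme are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F


def template_similarity(embeddings: torch.Tensor) -> torch.Tensor:
    """Pairwise cosine similarity between log-template embeddings.
    embeddings: (num_templates, dim) -> (num_templates, num_templates)."""
    normed = F.normalize(embeddings, dim=-1)
    return normed @ normed.T


class DualAttentionEncoder(nn.Module):
    """Assumed reading of the 'dual attention' idea: one self-attention branch over
    position-encoded inputs (positional information) and a second branch over the
    raw semantic embeddings (global dependencies); the branches are fused and fed
    to a classifier head for sequence-level anomaly detection."""

    def __init__(self, d_model: int = 128, n_heads: int = 4, max_len: int = 512):
        super().__init__()
        self.pos_emb = nn.Embedding(max_len, d_model)
        self.attn_pos = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.attn_glob = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.ff = nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(),
                                nn.Linear(4 * d_model, d_model))
        self.head = nn.Linear(d_model, 2)        # normal vs. anomalous

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) sequence of pre-trained template embeddings
        pos = self.pos_emb(torch.arange(x.size(1), device=x.device))
        xp = x + pos                              # inject positional information
        a1, _ = self.attn_pos(xp, xp, xp)         # branch 1: position-aware attention
        a2, _ = self.attn_glob(x, x, x)           # branch 2: global semantic attention
        h = self.norm1(x + a1 + a2)               # fuse the two attention branches
        h = self.norm2(h + self.ff(h))
        return self.head(h.mean(dim=1))           # sequence-level logits


if __name__ == "__main__":
    # Toy run: 8 log sequences, each of 20 template embeddings of dimension 128.
    seqs = torch.randn(8, 20, 128)
    model = DualAttentionEncoder()
    print(model(seqs).shape)                                 # torch.Size([8, 2])
    print(template_similarity(torch.randn(5, 128)).shape)    # torch.Size([5, 5])
```

In this sketch the similarity matrix is computed separately and could, for example, inform how rare template types are weighted during training; how (or whether) LogDA uses it that way is not specified in the abstract.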

Keywords: anomaly detection; system log; deep learning; Transformer; neural networks

Classification: TP391 [Automation and Computer Technology / Computer Application Technology]

 
