Enhanced Topic-Aware Summarization Using Statistical Graph Neural Networks  

Authors: Ayesha Khaliq, Salman Afsar Awan, Fahad Ahmad, Muhammad Azam Zia, Muhammad Zafar Iqbal

Affiliations: [1] Department of Computer Science, University of Agriculture Faisalabad, Faisalabad, Punjab, 37300, Pakistan; [2] School of Computing, Faculty of Technology, University of Portsmouth, Southsea, Portsmouth, PO1 2UP, UK; [3] Department of Mathematics and Statistics, University of Agriculture Faisalabad, Faisalabad, Punjab, 37300, Pakistan

Source: Computers, Materials & Continua, 2024, No. 8, pp. 3221-3242 (22 pages)

Abstract: The rapid expansion of online content and big data has created an urgent need for efficient summarization techniques that let readers swiftly comprehend vast textual documents without compromising their original integrity. Current approaches in Extractive Text Summarization (ETS) leverage the modeling of inter-sentence relationships, a task of paramount importance in producing coherent summaries. This study introduces an innovative model that integrates Graph Attention Networks (GATs) with Bidirectional Encoder Representations from Transformers (BERT) and Latent Dirichlet Allocation (LDA), further enhanced by Term Frequency-Inverse Document Frequency (TF-IDF) values, to improve sentence selection by capturing comprehensive topical information. Our approach constructs a graph with nodes representing sentences, words, and topics, thereby increasing interconnectivity and enabling a more refined understanding of text structure. The model is extended from Single-Document Summarization to Multi-Document Summarization (MDS), offering significant improvements over existing models such as THGS-GMM and Topic-GraphSum, as demonstrated by empirical evaluations on benchmark news datasets such as Cable News Network (CNN)/Daily Mail (DM) and Multi-News. The results consistently demonstrate superior performance, showcasing the model's robustness in handling complex summarization tasks across single- and multi-document contexts. This research not only advances the integration of BERT and LDA within a GAT framework but also emphasizes the model's capacity to effectively manage global information and adapt to diverse summarization challenges.
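To make the architecture described in the abstract concrete, the sketch below builds the kind of heterogeneous graph the paper outlines: sentence nodes featurized with BERT [CLS] embeddings, topic nodes from LDA, sentence-word edges weighted by TF-IDF, and a single hand-rolled GAT attention layer scoring sentences for extraction. This is a minimal illustration under stated assumptions, not the authors' released code: the model name bert-base-uncased, the topic count, the shared hidden size, and the untrained salience head are all illustrative choices, edge weights define connectivity only (plain GAT attention ignores them), and training (e.g., against oracle extraction labels) is omitted.

```python
# Illustrative sketch only (not the authors' implementation): heterogeneous
# graph with sentence, word, and topic nodes, scored by one GAT layer.
import torch
import torch.nn as nn
import torch.nn.functional as F
from sklearn.feature_extraction.text import TfidfVectorizer, CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from transformers import BertModel, BertTokenizer

sentences = [
    "The government announced a new climate policy on Monday.",
    "The policy targets a forty percent cut in emissions by 2030.",
    "Critics argue the plan lacks concrete funding commitments.",
]

# Sentence-node features: BERT [CLS] embeddings (768-d).
tok = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased")
with torch.no_grad():
    enc = tok(sentences, padding=True, truncation=True, return_tensors="pt")
    sent_x = bert(**enc).last_hidden_state[:, 0]               # (S, 768)

# Sentence-word edge weights from TF-IDF; topics from LDA on raw counts.
tfidf = TfidfVectorizer(stop_words="english")
sw = torch.tensor(tfidf.fit_transform(sentences).toarray(), dtype=torch.float)
counts = CountVectorizer(vocabulary=tfidf.vocabulary_).fit_transform(sentences)
T = 2                                       # illustrative topic count
lda = LatentDirichletAllocation(n_components=T, random_state=0)
st = torch.tensor(lda.fit_transform(counts), dtype=torch.float)        # (S, T)
tw = torch.tensor(lda.components_ / lda.components_.sum(1, keepdims=True),
                  dtype=torch.float)                                   # (T, V)
S, V = sw.shape

# Block adjacency over N = S + V + T nodes; nonzero weights define edges,
# plus self-loops so every row has at least one neighbour.
N = S + V + T
adj = torch.zeros(N, N)
adj[:S, S:S + V] = sw                       # sentence-word edges
adj[:S, S + V:] = st                        # sentence-topic edges
adj[S:S + V, S + V:] = tw.T                 # word-topic edges
adj = adj + adj.T + torch.eye(N)

class GATLayer(nn.Module):
    """One attention head: e_ij = LeakyReLU(a^T [W h_i || W h_j])."""
    def __init__(self, d):
        super().__init__()
        self.W = nn.Linear(d, d, bias=False)
        self.a = nn.Linear(2 * d, 1, bias=False)

    def forward(self, h, adj):
        z = self.W(h)
        zi = z.unsqueeze(1).expand(-1, z.size(0), -1)
        zj = z.unsqueeze(0).expand(z.size(0), -1, -1)
        e = F.leaky_relu(self.a(torch.cat([zi, zj], dim=-1)).squeeze(-1))
        alpha = torch.softmax(e.masked_fill(adj == 0, float("-inf")), dim=1)
        return F.elu(alpha @ z)              # attention-weighted neighbourhood

# Project each node type into a shared space, run one GAT pass, score sentences.
d = 64
h = torch.cat([nn.Linear(768, d)(sent_x),   # sentence nodes
               nn.Linear(T, d)(tw.T),       # word nodes: per-word topic profile
               nn.Linear(V, d)(tw)])        # topic nodes: per-topic word dist.
h = GATLayer(d)(h, adj)
scores = nn.Linear(d, 1)(h[:S]).squeeze(-1) # untrained salience head
top = scores.topk(2).indices.sort().values  # keep original document order
print([sentences[i] for i in top])
```

In the paper's setting, the same graph scales to Multi-Document Summarization by adding sentence nodes from every input document; the shared word and topic nodes then act as bridges that propagate cross-document salience through the attention layer.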

Keywords: summarization; graph attention network; bidirectional encoder representations from transformers; Latent Dirichlet Allocation; term frequency-inverse document frequency

Classification codes: TP391.41 [Automation and Computer Technology - Computer Application Technology]; TP183 [Automation and Computer Technology - Computer Science and Technology]

 
