A Dual Attention Encoder-Decoder Text Summarization Model  


Authors: Nada Ali Hakami, Hanan Ahmed Hosni Mahmoud

Affiliations: [1] Jazan University, Computer Science Department, College of Computer Science and Information Technology, Jazan, Saudi Arabia; [2] Department of Computer Sciences, College of Computer and Information Sciences, Princess Nourah bint Abdulrahman University, P.O. Box 84428, Riyadh 11671, Saudi Arabia

Source: Computers, Materials & Continua, 2023, No. 2, pp. 3697-3710 (14 pages)

Funding: Princess Nourah bint Abdulrahman University Researchers Supporting Project Number (PNURSP2022R113), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.

Abstract: A worthy text summary should represent the fundamental content of the document. Recent studies on automated text summarization have tried to address this challenging problem. Attention models are employed extensively in the text summarization process; classical attention techniques acquire context data in the decoding phase. Nevertheless, without effective feature extraction, the produced summary may diverge from the core topic. In this article, we present an encoder-decoder summarization system employing a dual attention mechanism, in which the attention algorithm also gathers salient content from the encoder side. With the dual attention model, the system can capture and produce more rational main content, and merging the two attention phases yields precise and coherent summaries. The enhanced attention mechanism assigns higher scores to repeated text, raising phrase scores, and also captures the relationship between phrases and the title, scoring such phrases higher. We assessed the proposed model with and without significance optimization through an ablation study. The model with significance optimization achieved the highest precision, 96.7%, and the lowest CPU time among the compared models in both training and sentence extraction.
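To make the dual attention idea concrete, below is a minimal PyTorch sketch of an encoder-decoder with two attention heads over the encoder states: one supplying the usual decoding context and a second re-reading the source for salient content, with the two contexts merged before each decoder step. This is an illustration under assumptions, not the paper's exact architecture; the GRU layers, additive (Bahdanau-style) attention, concatenation fusion, and names such as `context_attn` and `salience_attn` are all hypothetical choices.

```python
# Hypothetical dual-attention encoder-decoder sketch (layer sizes, GRU cells,
# additive attention, and concatenation fusion are assumptions, not the
# paper's published design).
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdditiveAttention(nn.Module):
    """Bahdanau-style additive attention over encoder states."""
    def __init__(self, hidden_dim):
        super().__init__()
        self.w_enc = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.w_dec = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.v = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, dec_state, enc_outputs):
        # dec_state: (batch, hidden); enc_outputs: (batch, src_len, hidden)
        scores = self.v(torch.tanh(
            self.w_enc(enc_outputs) + self.w_dec(dec_state).unsqueeze(1)
        )).squeeze(-1)                                  # (batch, src_len)
        weights = F.softmax(scores, dim=-1)
        context = torch.bmm(weights.unsqueeze(1), enc_outputs).squeeze(1)
        return context, weights

class DualAttentionSummarizer(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        # Two attention heads: one for decoding context, one re-reading
        # the encoder side for salient (significant) content.
        self.context_attn = AdditiveAttention(hidden_dim)
        self.salience_attn = AdditiveAttention(hidden_dim)
        self.decoder = nn.GRUCell(emb_dim + 2 * hidden_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, src_ids, tgt_ids):
        enc_outputs, h = self.encoder(self.embed(src_ids))
        dec_state = h.squeeze(0)
        logits = []
        for t in range(tgt_ids.size(1)):
            emb = self.embed(tgt_ids[:, t])
            ctx, _ = self.context_attn(dec_state, enc_outputs)
            sal, _ = self.salience_attn(dec_state, enc_outputs)
            # Merge the two attention contexts before the decoder step.
            dec_state = self.decoder(torch.cat([emb, ctx, sal], -1), dec_state)
            logits.append(self.out(dec_state))
        return torch.stack(logits, dim=1)   # (batch, tgt_len, vocab)

# Smoke test with random token ids.
model = DualAttentionSummarizer(vocab_size=1000)
src = torch.randint(0, 1000, (2, 20))
tgt = torch.randint(0, 1000, (2, 8))
print(model(src, tgt).shape)  # torch.Size([2, 8, 1000])
```

The phrase-significance scoring described in the abstract (rewarding repetition and title overlap) would act on top of such a model, e.g. as features feeding the salience head or as a post-hoc sentence scorer; the abstract does not specify which, so that part is left out of the sketch.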

Keywords: text summarization; attention model; phrase significance

Classification: TP311 [Automation and Computer Technology / Computer Software and Theory]

 
