An Efficient Long Short-Term Memory Model for Digital Cross-Language Summarization  


Authors: Y.C.A. Padmanabha Reddy, Shyam Sunder Reddy Kasireddy, Nageswara Rao Sirisala, Ramu Kuchipudi, Purnachand Kollapudi

Affiliations: [1] Department of CSE, B V Raju Institute of Technology, Narsapur, Medak, T.S., 502313, India; [2] Department of IT, Vasavi College of Engineering, Hyderabad, T.S., 500089, India; [3] Department of CSE, K.S.R.M College of Engineering, Kadapa, A.P., 516003, India; [4] Department of IT, C.B.I.T, Gandipet, Hyderabad, Telangana, 500075, India

Source: Computers, Materials & Continua, 2023, Issue 3, pp. 6389-6409 (21 pages)

Abstract: The rise of social networking has enabled the creation of Internet-accessible digital documents in several languages. Evaluating such documents requires Cross-Language Text Summarization (CLTS), in which target documents are generated from source documents written in disparate languages. These digital documents must be processed together with their contextual semantic data through a decoding scheme. This paper presents a multilingual cross-language approach to document processing with abstractive summarization. The proposed model, the Hidden Markov Model LSTM Reinforcement Learning (HMMlstmRL) model, operates in three stages. First, a Hidden Markov Model computes keywords across the cross-language words for clustering. Second, bi-directional long short-term memory (BiLSTM) networks extract keywords in the cross-language process. Finally, HMMlstmRL applies a voting concept from reinforcement learning to identify and extract the keywords. The performance of the proposed HMMlstmRL is 2% better than that of the conventional bi-directional LSTM model.
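To make the second stage of the pipeline more concrete, the following is a minimal illustrative sketch (not the authors' implementation) of a per-token BiLSTM keyword scorer: each token in a tokenized sentence receives a keyword probability. The vocabulary size, sequence length, and hyper-parameters below are assumptions chosen only for illustration.

# Minimal sketch of a BiLSTM keyword-extraction stage (assumed setup, not the paper's code).
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

VOCAB_SIZE = 20000   # assumed size of the multilingual vocabulary
MAX_LEN = 50         # assumed maximum sentence length in tokens

model = models.Sequential([
    layers.Embedding(VOCAB_SIZE, 128, input_length=MAX_LEN),          # token embeddings
    layers.Bidirectional(layers.LSTM(64, return_sequences=True)),     # bi-directional LSTM encoder
    layers.TimeDistributed(layers.Dense(1, activation="sigmoid")),    # per-token keyword probability
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Dummy data standing in for tokenized cross-language sentences and binary keyword labels.
x = np.random.randint(0, VOCAB_SIZE, size=(32, MAX_LEN))
y = np.random.randint(0, 2, size=(32, MAX_LEN, 1)).astype("float32")
model.fit(x, y, epochs=1, verbose=0)

In the full HMMlstmRL pipeline described in the abstract, such per-token scores would be combined with the HMM clustering output and the reinforcement-learning voting step; those stages are not shown here.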

Keywords: text summarization; reinforcement learning; hidden Markov model; cross-language; multilingual

Classification code: TP391.1 [Automation and Computer Technology - Computer Application Technology]

 
