Neural fast reading based on a skimming module


Authors: HOU Lei, MENG Huiming, LI Xu, LAN Zhenping (School of Information Science and Engineering, Dalian Polytechnic University, Dalian 116034, China)

Affiliation: School of Information Science and Engineering, Dalian Polytechnic University, Dalian 116034, Liaoning, China

Source: Journal of Dalian Polytechnic University, 2022, No. 2, pp. 136-141.

Funding: Scientific Research Project of the Education Department of Liaoning Province (J2020113).

Abstract: Recurrent neural networks (RNN) are widely used in natural language processing tasks over time-sequenced input. Because of the sequential nature of an RNN, every input token must be read by the network, so inference time grows linearly with input length. To address this problem, this work proposes a long short-term memory (LSTM) model based on a skimming module, which reads answer-relevant content quickly at inference time while skipping irrelevant words within important sentences and unimportant parts of the text. The model applies to a range of text tasks, including text classification, sentiment analysis, and reading comprehension. To verify that it matches the accuracy of a standard LSTM, five fast-reading models of the same type were chosen for experimental comparison. The results show that the proposed model performs fewer floating-point operations (FLOPs) and completes the reading and inference process faster.
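The skimming idea summarized in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: a cheap learned gate scores each input token, and tokens scoring below a threshold skip the full LSTM update, carrying the state forward unchanged so that only the inexpensive gate evaluation is paid for them. All dimensions, parameter names, and the threshold here are assumptions; in the paper, the discrete skip decision would presumably be trained with reinforcement learning, as the keywords suggest, since it is not differentiable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical; the paper does not give sizes).
d_in, d_h = 8, 16

# Random LSTM parameters, stacked for the four gates (i, f, o, g).
W = rng.normal(scale=0.1, size=(4 * d_h, d_in + d_h))
b = np.zeros(4 * d_h)

# Skim-gate parameters: a scalar "read vs. skip" score per token.
w_skim = rng.normal(scale=0.1, size=d_in)
b_skim = 0.0

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c):
    """One full (expensive) LSTM cell update."""
    z = W @ np.concatenate([x, h]) + b
    i, f, o = (sigmoid(z[k * d_h:(k + 1) * d_h]) for k in range(3))
    g = np.tanh(z[3 * d_h:])
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

def skim_lstm(xs, threshold=0.5):
    """Run the LSTM over a token sequence, performing the full update
    only when the skim gate judges the token important; skipped tokens
    leave the hidden state unchanged, which is where FLOPs are saved."""
    h, c = np.zeros(d_h), np.zeros(d_h)
    n_read = n_skim = 0
    for x in xs:
        p_read = sigmoid(w_skim @ x + b_skim)
        if p_read >= threshold:
            h, c = lstm_step(x, h, c)
            n_read += 1
        else:
            n_skim += 1  # state carried forward unchanged: token skimmed
    return h, n_read, n_skim
```

Counting read versus skimmed tokens makes the FLOP comparison from the abstract concrete: a skipped step costs one dot product instead of a full cell update.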

Keywords: machine reading; long short-term memory network model; fast neural network; reinforcement learning

CLC number: TP391 (Automation and Computer Technology: Computer Application Technology)
