Vietnamese Chunk Identification Incorporating Attention Mechanism (Cited by: 1)

Authors: WANG Wenhui (王闻慧); BI Yude (毕玉德); LEI Shujie (雷树杰) (Luoyang Division, Information Engineering University, Luoyang, Henan 471003, China; College of Foreign Language and Literature, Fudan University, Shanghai 200433, China)

Affiliations: [1] Luoyang Division, Information Engineering University, Luoyang, Henan 471003, China; [2] College of Foreign Language and Literature, Fudan University, Shanghai 200433, China

Source: Journal of Chinese Information Processing (《中文信息学报》), 2019, No. 12, pp. 91-100 (10 pages)

Abstract: For the Vietnamese chunk identification task, building on a prior statistical survey of the part-of-speech composition patterns inside Vietnamese chunks, this paper proposes two ways to incorporate attention mechanisms into the Bi-LSTM+CRF model. The first integrates an attention mechanism at the input layer, allowing the model to flexibly adjust the respective weights of the word embeddings and the POS feature embeddings. The second adds a multi-head attention mechanism on top of the Bi-LSTM, enabling the model to learn a weight matrix over the Bi-LSTM outputs and selectively focus on important information. Experimental results show that integrating attention at the input layer improves the F-score of chunk identification by 3.08%, and adding multi-head attention on top of the Bi-LSTM improves it by 4.56%, demonstrating the effectiveness of both methods.
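The two mechanisms described in the abstract can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: all weight matrices here are random placeholders standing in for learned parameters, the exact scoring function for the input-layer attention is an assumption (the paper does not specify it in the abstract), and the CRF layer is omitted. The sketch only shows (1) per-token weighting of word vs. POS embeddings and (2) multi-head scaled dot-product self-attention over a sequence of hidden states such as Bi-LSTM outputs.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def input_attention_fuse(word_emb, pos_emb, w_att):
    """Input-layer attention (hypothetical scoring): per token, score each
    feature source with a shared vector w_att, softmax the two scores, and
    take the weighted sum. word_emb, pos_emb: (seq_len, d) -> (seq_len, d)."""
    scores = np.stack([word_emb @ w_att, pos_emb @ w_att], axis=-1)  # (seq_len, 2)
    alpha = softmax(scores, axis=-1)            # weights over {word, POS}, sum to 1
    return alpha[:, :1] * word_emb + alpha[:, 1:] * pos_emb

def multi_head_self_attention(H, Wq, Wk, Wv, n_heads):
    """Multi-head scaled dot-product self-attention over hidden states H
    (e.g. Bi-LSTM outputs). H: (seq_len, d_model); Wq/Wk/Wv: (d_model, d_model)."""
    seq_len, d_model = H.shape
    d_head = d_model // n_heads
    def split(X):  # (seq_len, d_model) -> (n_heads, seq_len, d_head)
        return X.reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    Q, K, V = split(H @ Wq), split(H @ Wk), split(H @ Wv)
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_head)  # (n_heads, seq_len, seq_len)
    A = softmax(scores, axis=-1)                         # attention per head
    out = A @ V                                          # (n_heads, seq_len, d_head)
    return out.transpose(1, 0, 2).reshape(seq_len, d_model)

# Toy sequence: 5 tokens, 8-dimensional embeddings, 2 attention heads.
seq_len, d = 5, 8
word_emb = rng.standard_normal((seq_len, d))
pos_emb = rng.standard_normal((seq_len, d))
fused = input_attention_fuse(word_emb, pos_emb, rng.standard_normal(d))
attended = multi_head_self_attention(
    fused, *(rng.standard_normal((d, d)) for _ in range(3)), n_heads=2)
print(fused.shape, attended.shape)  # (5, 8) (5, 8)
```

In the paper's full model, the fused input vectors would feed a Bi-LSTM, the attended outputs would feed a CRF decoding layer, and all weights would be trained end to end.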

Keywords: Vietnamese; chunk identification; Bi-LSTM+CRF model; attention mechanism

Classification: TP391 [Automation and Computer Technology - Computer Application Technology]

 
