The Fusion of Frame Representations for CFN Lexical Unit Expansion

Authors: Ren Guohua; Lü Guoying [1]; Li Ru [1,2,3]; Wang Yan

Affiliations: [1] College of Computer and Information Technology, Shanxi University, Taiyuan 030006, Shanxi, China; [2] Key Laboratory of Computer Intelligence and Chinese Information Processing of Ministry of Education, Shanxi University, Taiyuan 030006, Shanxi, China; [3] Collaborative Innovation Center of Big Data Mining and Intelligent Technology in Shanxi, Taiyuan 030006, China

Published in: Computer Applications and Software (《计算机应用与软件》), 2023, No. 4, pp. 122-127 and 146 (7 pages)

Funding: National Social Science Fund of China project (18BYY009).

Abstract: Because the lexical unit coverage of Chinese FrameNet (CFN) is incomplete, CFN cannot perform frame-semantic analysis on large-scale real-world text. Frame semantics explains the meaning of words by establishing frames, so frames and words share a degree of semantic relatedness, yet existing methods for lexical unit expansion often ignore this relatedness. This paper therefore proposes a neural network model that fuses frame representations for CFN lexical unit expansion. A bidirectional LSTM encodes each word's dictionary definition and the frame name, and an attention mechanism produces a frame-aware representation of the dictionary definition. The frame representation and the definition representation are then fused to score every word in the dictionary, and the highest-scoring words are output as candidate lexical units. Experimental results show that the method effectively improves the accuracy of CFN lexical unit expansion and outperforms the baseline models.

Keywords: CFN; lexical unit expansion; semantic relatedness; attention mechanism

Classification: TP391 [Automation and Computer Technology — Computer Application Technology]
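The attention-and-fusion scoring step described in the abstract can be sketched as follows. This is a minimal numpy illustration, assuming the BiLSTM encoders have already produced hidden states for the definition tokens and a vector for the frame name; all function names, weights, and dimensions here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def frame_aware_score(definition_states, frame_vec, w):
    """Score one dictionary word against one frame (illustrative sketch).

    definition_states: (T, d) BiLSTM hidden states of the word's dictionary definition.
    frame_vec:         (d,)  representation of the frame name.
    w:                 (2d,) hypothetical scoring weights for the fused representation.
    """
    # Attention: weight each definition token by its relevance to the frame.
    attn = softmax(definition_states @ frame_vec)      # (T,) attention weights
    def_repr = attn @ definition_states                # (d,) frame-aware definition vector
    # Fuse the frame representation with the definition representation, then score.
    fused = np.concatenate([frame_vec, def_repr])      # (2d,)
    return float(fused @ w)

# Toy example: random "encoder outputs" for one word and one frame.
rng = np.random.default_rng(0)
d, T = 8, 5
states = rng.normal(size=(T, d))
frame = rng.normal(size=d)
w = rng.normal(size=2 * d)
score = frame_aware_score(states, frame, w)  # higher score => better candidate lexical unit
```

In the paper's setting, such a score would be computed for every word in the dictionary against a target frame, and the highest-scoring words would be proposed as new lexical units for that frame.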

 
