Authors: ZHANG Ming [1]; LIAO Xi [2]
Affiliations: [1] School of Software, Chengdu Polytechnic, Chengdu 610041, China; [2] School of Communication and Information Engineering, Chongqing University of Posts and Telecommunications, Chongqing 400065, China
Source: Journal of Southwest University (Natural Science Edition), 2024, No. 10, pp. 222-232 (11 pages)
Funding: National Natural Science Foundation of China (61801062); Natural Science Foundation of Chongqing (cstc2021Jcyj-msxmX0634).
Abstract: The analysis of prepositional structures is difficult because prepositions and their structures must be effectively classified, their semantic information mined, and the resulting structures disambiguated. To address this challenge, this paper combines artificial intelligence and neural network techniques and proposes a tree recursive neural network model based on long short-term memory and an attention mechanism (ACNN-Tree-LSTM), aiming to solve context-based preposition disambiguation in natural language processing. The core idea is to introduce an attention mechanism that focuses the model on the contextual information most relevant to the meaning of the preposition. First, the context parse tree and context word embeddings are combined to capture the semantic relationships between context words. Then, a tree recursive neural network with long short-term memory (Tree-LSTM) generates hidden features for each node in the tree, and the context representation of each node is computed by recursively tracking propagation along the tree's branches. Finally, to reduce the influence of noise on the key information related to the preposition's meaning, an attention-based convolutional neural network (ACNN) is introduced so that the model focuses on the parts of the document that require disambiguation. This allows the model to automatically select and attend to the words most relevant to the current prepositional meaning, thereby improving disambiguation accuracy. Experimental results on the SemEval 2013 Task 12 word sense disambiguation dataset show that the proposed model achieves an F1-score of 88.04%, outperforming existing mainstream deep learning models and validating the effectiveness of the proposed approach.
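The two core steps described above, computing a hidden state for each parse-tree node bottom-up and then attention-pooling those states around the preposition, can be illustrated with a minimal sketch. This is not the authors' implementation: it uses a scalar-dimension Child-Sum Tree-LSTM cell with toy random weights and a simple softmax attention over node states (the `query` standing in for the preposition node), purely to show the recursion and pooling structure.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class ChildSumTreeLSTMCell:
    """Scalar-dimension Child-Sum Tree-LSTM cell (illustrative only).
    A real model learns weight matrices over word-embedding vectors."""
    def __init__(self, seed=0):
        rng = random.Random(seed)
        # one weight per gate (input, forget, output, update) for the
        # node's input x and for the summed child hidden state h_tilde
        self.w = {g: rng.uniform(-0.5, 0.5) for g in "ifou"}
        self.u = {g: rng.uniform(-0.5, 0.5) for g in "ifou"}

    def node(self, x, children):
        """children: list of (h, c) pairs returned by child nodes."""
        h_tilde = sum(h for h, _ in children)  # 0.0 for leaves
        i = sigmoid(self.w["i"] * x + self.u["i"] * h_tilde)
        o = sigmoid(self.w["o"] * x + self.u["o"] * h_tilde)
        u = math.tanh(self.w["u"] * x + self.u["u"] * h_tilde)
        # one forget gate per child, conditioned on that child's state
        c = i * u + sum(
            sigmoid(self.w["f"] * x + self.u["f"] * h) * ch
            for h, ch in children
        )
        h = o * math.tanh(c)
        return h, c

def attention_pool(hs, query):
    """Softmax attention over node hidden states: up-weights nodes
    whose state aligns with the query (here the preposition node)."""
    scores = [h * query for h in hs]
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    weights = [e / z for e in exps]
    return sum(w * h for w, h in zip(weights, hs)), weights

# Tiny parse tree over scalar "embeddings": three leaves, one root.
cell = ChildSumTreeLSTMCell()
leaf1 = cell.node(0.8, [])            # e.g. "book"
leaf2 = cell.node(-0.3, [])           # e.g. "on" (the preposition)
leaf3 = cell.node(0.5, [])            # e.g. "table"
root = cell.node(0.1, [leaf1, leaf2, leaf3])

hs = [leaf1[0], leaf2[0], leaf3[0], root[0]]
context, weights = attention_pool(hs, query=leaf2[0])
print(round(sum(weights), 6))  # → 1.0 (softmax weights normalize)
```

In the paper's full model the pooled context vector would then feed a classifier over preposition senses; here it simply demonstrates how branch-wise recursion and attention pooling compose.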
Keywords: artificial intelligence; neural networks; preposition disambiguation; deep learning; attention mechanism
Classification: TP393 [Automation and Computer Technology: Computer Application Technology]