Efficient Parameterization for Knowledge Graph Embedding Using Hierarchical Attention Network  


Authors: Zhen-Yu Chen, Feng-Chi Liu, Xin Wang, Cheng-Hsiung Lee, Ching-Sheng Lin

Affiliations: [1] Master Program of Digital Innovation, Tunghai University, Taichung 40704, Taiwan; [2] Department of Statistics, Feng Chia University, Taichung 40724, Taiwan; [3] College of Integrated Health Sciences and the AI Plus Institute, The University at Albany, State University of New York (SUNY), Albany, NY 12222, USA

Source: Computers, Materials & Continua, 2025, Issue 3, pp. 4287-4300 (14 pages)

Funding: Supported by the National Science and Technology Council (NSTC), Taiwan, under Grant Numbers 112-2622-E-029-009 and 112-2221-E-029-019.

Abstract: In the domain of knowledge graph embedding, conventional approaches typically transform entities and relations into continuous vector spaces. However, parameter efficiency becomes increasingly crucial when dealing with large-scale knowledge graphs that contain vast numbers of entities and relations. In particular, resource-intensive embeddings often increase computational costs and may limit scalability and adaptability in practical environments, such as low-resource settings or real-world applications. This paper explores an approach to knowledge graph representation learning that leverages small reserved entity and relation sets for parameter-efficient embedding. We introduce a hierarchical attention network designed to refine and maximize the representational quality of embeddings by selectively focusing on these reserved sets, thereby reducing model complexity. Empirical assessments validate that our model achieves high performance on the benchmark dataset with fewer parameters and smaller embedding dimensions. Ablation studies further highlight the impact and contribution of each component of the proposed hierarchical attention structure.

Keywords: knowledge graph embedding; parameter efficiency; representation learning; reserved entity and relation sets; hierarchical attention network

Classification Code: TP391.41 [Automation and Computer Technology / Computer Application Technology]
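
The abstract above describes building entity embeddings from a small reserved set via attention. The following is a minimal sketch of that general idea, assuming PyTorch; it is not the authors' implementation, and the names (ReservedSetEmbedding, num_reserved, query_dim) are illustrative assumptions. Each entity keeps only a small query vector and attends over a shared reserved embedding table, so the parameter count grows mainly with the size of the reserved set rather than with the number of entities.

    # Minimal sketch (assumed, not the paper's code): parameter-efficient entity
    # embeddings via attention over a small reserved embedding set.
    import torch
    import torch.nn as nn

    class ReservedSetEmbedding(nn.Module):
        def __init__(self, num_entities: int, num_reserved: int, dim: int, query_dim: int = 8):
            super().__init__()
            # Shared reserved embeddings and their attention keys (small, fixed-size tables).
            self.reserved = nn.Parameter(torch.randn(num_reserved, dim))
            self.keys = nn.Parameter(torch.randn(num_reserved, query_dim))
            # Per-entity parameters are only a low-dimensional query vector.
            self.queries = nn.Embedding(num_entities, query_dim)

        def forward(self, entity_ids: torch.Tensor) -> torch.Tensor:
            q = self.queries(entity_ids)                            # (batch, query_dim)
            scores = q @ self.keys.t() / self.keys.size(1) ** 0.5   # scaled dot-product scores
            weights = torch.softmax(scores, dim=-1)                 # (batch, num_reserved)
            return weights @ self.reserved                          # (batch, dim)

    # Usage: 10,000 entities represented with only 64 reserved vectors of dimension 200.
    emb = ReservedSetEmbedding(num_entities=10_000, num_reserved=64, dim=200)
    vectors = emb(torch.tensor([0, 42, 9_999]))
    print(vectors.shape)  # torch.Size([3, 200])

A relation-side counterpart and the paper's hierarchical attention layers would stack on top of such a component; this sketch only illustrates the reserved-set bottleneck that yields the parameter savings.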
