Authors: Lei Chen, Yuan Li, Yong Lei, Xingye Deng
Affiliations: [1] School of Information and Electrical Engineering, Hunan University of Science and Technology, Xiangtan 411201, China; [2] School of Computer Science and Engineering, Hunan University of Science and Technology, Xiangtan 411201, China
Source: Tsinghua Science and Technology (清华大学学报(自然科学版)(英文版)), 2024, Issue 2, pp. 553-575 (23 pages)
Funding: Supported by the National Key Research and Development Program (No. 2019YFE0105300); the National Natural Science Foundation of China (No. 62103143); the Hunan Province Key Research and Development Program (No. 2022WK2006); the Special Project for the Construction of Innovative Provinces in Hunan (Nos. 2020TP2018 and 2019GK4030); and the Scientific Research Fund of Hunan Provincial Education Department (No. 22B0471).
Abstract: Metapaths with specific, complex semantics are critical for most existing representation learning models to capture the diverse semantic and structural information of heterogeneous networks (HNs). However, any metapath composed of multiple simple metarelations must be designed by domain experts. These sensitive, expensive, and limited metapaths severely reduce the flexibility and scalability of existing models. To address this problem, a metapath-free, scalable representation learning model, called Metarelation2vec, is proposed for HNs with biased joint learning of all metarelations. Specifically, a metarelation-aware, biased walk strategy is first designed to obtain better training samples by automatically generating cooperation probabilities for all metarelations rather than relying on expert-given metapaths. Thereafter, with nodes grouped by type, a common, shallow skip-gram model is used to learn structural proximity separately for each node type. Next, with links grouped by type, a novel shallow model is used to learn semantic proximity separately for each link type. Finally, supervised by the cooperation probabilities of all meta-words, the biased training samples are fed into the shallow models to jointly learn the structural and semantic information in the HNs, ensuring the accuracy and scalability of the model. Extensive experimental results on three tasks and four open datasets demonstrate the advantages of the proposed model.
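To make the metarelation-aware biased walk described in the abstract concrete, the following minimal Python sketch samples walks on a toy heterogeneous network in which next-hop weights depend on the previously traversed metarelation. The toy nodes, node types, and the hand-set cooperation-probability table are illustrative assumptions only, not the paper's data, implementation, or auto-generated probabilities.

```python
import random
from collections import defaultdict

# Toy heterogeneous network (illustrative placeholders, not the paper's data).
node_type = {"a1": "author", "a2": "author", "p1": "paper", "p2": "paper", "v1": "venue"}
edges = defaultdict(list)
for u, v in [("a1", "p1"), ("a2", "p1"), ("a1", "p2"), ("p1", "v1"), ("p2", "v1")]:
    edges[u].append(v)
    edges[v].append(u)

# Hypothetical cooperation probabilities between consecutive metarelations:
# given the metarelation just traversed, favor certain next metarelations.
# In the paper these are auto-generated; here they are hand-set so the sketch runs.
coop_prob = defaultdict(lambda: 1.0)
coop_prob[(("author", "paper"), ("paper", "venue"))] = 2.0
coop_prob[(("paper", "venue"), ("venue", "paper"))] = 2.0

def metarelation(u, v):
    # A metarelation here is simply the ordered pair of endpoint node types.
    return (node_type[u], node_type[v])

def biased_walk(start, length):
    """Sample one metarelation-aware biased walk of the given length."""
    walk = [start]
    prev_rel = None
    for _ in range(length - 1):
        cur = walk[-1]
        cands = edges[cur]
        if not cands:
            break
        if prev_rel is None:
            weights = [1.0] * len(cands)  # first hop: uniform
        else:
            weights = [coop_prob[(prev_rel, metarelation(cur, v))] for v in cands]
        nxt = random.choices(cands, weights=weights, k=1)[0]
        prev_rel = metarelation(cur, nxt)
        walk.append(nxt)
    return walk

if __name__ == "__main__":
    random.seed(0)
    print(biased_walk("a1", 5))  # prints one sampled walk over the toy network
```

In this sketch, walks sampled this way would then serve as the biased training samples that the type-grouped shallow models consume; the grouped skip-gram and link-type models themselves are not shown here.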
Keywords: metarelation; random walk; heterogeneous network; metapath; representation learning