Hierarchical representation learning method for graph based on importance pooling  (Cited by: 2)

Authors: ZHANG Hongmei [1,2]; LI Haoran [1]; ZHANG Xiangli [1,2] (School of Information and Communication, Guilin University of Electronic Technology, Guilin 541004, China; Key Laboratory of Cloud Computing and Complex Systems in Guangxi Universities, Guilin University of Electronic Technology, Guilin 541004, China)

Affiliations: [1] School of Information and Communication, Guilin University of Electronic Technology, Guilin 541004, Guangxi, China; [2] Key Laboratory of Cloud Computing and Complex Systems in Guangxi Universities, Guilin University of Electronic Technology, Guilin 541004, Guangxi, China

Source: Journal of Guilin University of Electronic Technology, 2020, No. 4, pp. 300-304 (5 pages)

Funding: National Natural Science Foundation of China (61461010); Key Laboratory of Cognitive Radio and Information Processing, Ministry of Education (CRKL170103, CRKL170104); Key Laboratory of Cloud Computing and Complex Systems in Guangxi Universities (YF16203).

Abstract: To address two problems of traditional graph neural networks on graph classification tasks, namely excessive noise during training and an inability to fully mine the hierarchical representation information of a graph, an end-to-end hierarchical graph representation learning method based on importance pooling is proposed. The method is built on an intra-layer/inter-layer joint feature extraction structure and consists of two parts: an intra-layer feature extraction module and an inter-layer feature extraction module. A pooling operation coarsens the graph into a higher-level subgraph structure to reduce the size of the feature map, and a recurrent unit suppresses the propagation of inter-layer noise while adaptively aggregating the hierarchical representations. Experimental results show that, at reasonable time complexity, the method makes the loss function converge to a smaller value and significantly improves model accuracy.
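The coarsening step described in the abstract, scoring nodes by importance and keeping only the top-scoring ones to form a smaller subgraph, can be illustrated with a generic top-k pooling sketch in NumPy. This is not the paper's exact formulation: the scoring projection `p` (here a fixed random vector standing in for learned parameters), the keep `ratio`, and the tanh gating are assumptions borrowed from common importance-pooling designs.

```python
import numpy as np

def importance_pool(X, A, ratio=0.5):
    """Generic importance-based top-k graph pooling (an illustrative
    sketch, not the authors' exact method).

    X: (n, d) node feature matrix
    A: (n, n) adjacency matrix
    ratio: fraction of nodes to retain in the coarsened subgraph
    """
    # Score each node with a projection vector; a fixed random vector
    # stands in here for a learned parameter (assumption).
    rng = np.random.default_rng(0)
    p = rng.standard_normal(X.shape[1])
    scores = X @ p / (np.linalg.norm(p) + 1e-12)

    # Keep the k highest-scoring ("most important") nodes.
    k = max(1, int(ratio * X.shape[0]))
    idx = np.argsort(scores)[-k:]

    # Gate the retained features by their squashed scores, then slice
    # the adjacency matrix down to the induced subgraph.
    X_pool = X[idx] * np.tanh(scores[idx])[:, None]
    A_pool = A[np.ix_(idx, idx)]
    return X_pool, A_pool

# Toy usage: a fully connected 4-node graph coarsened to 2 nodes.
X = np.arange(12, dtype=float).reshape(4, 3)
A = np.ones((4, 4)) - np.eye(4)
X_pool, A_pool = importance_pool(X, A, ratio=0.5)
print(X_pool.shape, A_pool.shape)  # (2, 3) (2, 2)
```

Stacking several such pooling layers yields progressively coarser subgraphs, which is what makes a layer-wise (hierarchical) readout possible.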

Keywords: graph neural network; importance pooling; hierarchical representation learning

Classification: TP311 [Automation and Computer Technology - Computer Software and Theory]

 
