Graph distillation with network symmetry  


Authors: Feng Lin, Jia-Lin He (林峰, 何嘉林)

Affiliations: [1] China West Normal University, Nanchong 637000, China; [2] The Internet of Things Perception and Big Data Analysis Key Laboratory of Nanchong City, Nanchong 637000, China; [3] Institute of Artificial Intelligence, China West Normal University, Nanchong 637000, China

Source: Chinese Physics B, 2025, No. 4, pp. 262-271 (10 pages)

Funding: Project supported by the National Natural Science Foundation of China (Grant No. 62176217); the Program of Sichuan Provincial Science and Technology, China (Grant No. 2018RZ0081); and the Fundamental Research Funds of China West Normal University (Grant No. 17E063).

Abstract: Graph neural networks (GNNs) have demonstrated excellent performance in graph representation learning. However, as the volume of graph data grows, issues of cost and efficiency become increasingly prominent. Graph distillation methods address this challenge by extracting a smaller, reduced graph, ensuring that GNNs trained on the original and reduced graphs show similar performance. Existing methods, however, primarily optimize the feature matrix of the reduced graph and rely on correlation information from GNNs, while neglecting the original graph's structure and redundant nodes. This often results in a loss of critical information within the reduced graph. To overcome this limitation, we propose a graph distillation method guided by network symmetry. Specifically, we identify symmetric nodes with equivalent neighborhood structures and merge them into "super nodes", thereby simplifying the network structure, reducing redundant parameter optimization, and enhancing training efficiency. At the same time, instead of relying on the original node features, we employ gradient descent to match optimal features that align with the original features, thus improving downstream task performance. Theoretically, our method guarantees that the reduced graph retains the key information present in the original graph. Extensive experiments demonstrate that our approach achieves significant improvements in graph distillation, exhibiting strong generalization capability and outperforming existing graph reduction methods.
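The super-node construction described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes "symmetric nodes with equivalent neighborhood structures" means nodes whose neighbor sets are identical (structural equivalence), groups such nodes, and collapses each group into one representative super node; the function name and graph encoding are hypothetical.

```python
# Hypothetical sketch: merge structurally equivalent ("symmetric") nodes,
# i.e. nodes with identical neighbor sets, into super nodes to obtain a
# reduced graph. Assumes the graph is given as an adjacency dict.
from collections import defaultdict

def merge_symmetric_nodes(adj):
    """adj: dict mapping node -> set of neighbor nodes.
    Returns (reduced adjacency over super nodes, node -> super-node map)."""
    # Nodes sharing the same neighbor set are structurally equivalent.
    groups = defaultdict(list)
    for node, nbrs in adj.items():
        groups[frozenset(nbrs)].append(node)

    # Pick a deterministic representative (super node) for each group.
    rep = {}
    for members in groups.values():
        leader = min(members)
        for m in members:
            rep[m] = leader

    # Rebuild the adjacency over super nodes, dropping self-loops.
    reduced = defaultdict(set)
    for node, nbrs in adj.items():
        for n in nbrs:
            if rep[node] != rep[n]:
                reduced[rep[node]].add(rep[n])
    return dict(reduced), rep

# Toy graph: nodes 2 and 3 both connect exactly to {0, 1}, so they
# are symmetric and collapse into one super node.
adj = {0: {1, 2, 3}, 1: {0, 2, 3}, 2: {0, 1}, 3: {0, 1}}
reduced, rep = merge_symmetric_nodes(adj)
# rep[2] == rep[3]: nodes 2 and 3 share a super node.
```

In the full method, the super nodes' feature matrix would then be optimized by gradient descent to match the original features, rather than copied from them.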

Keywords: graph neural networks; graph distillation; network symmetry; super nodes; feature optimization

Classification: TP391.4 [Automation and Computer Technology / Computer Application Technology]
