Enhanced Acceleration for Generalized Nonconvex Low-Rank Matrix Learning  


Authors: Hengmin Zhang, Jian Yang, Wenli Du, Bob Zhang, Zhiyuan Zha, Bihan Wen

Affiliations: [1] School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore 639798, Singapore; [2] School of Computer Science and Engineering, Nanjing University of Science and Technology, Nanjing 210094, China; [3] School of Information Science and Engineering, East China University of Science and Technology, Shanghai 200237, China; [4] Department of Electrical and Computer Engineering, University of Macao, Macao 999078, China

Source: Chinese Journal of Electronics (电子学报(英文版)), 2025, No. 1, pp. 98-113 (16 pages)

Funding: Supported by the Ministry of Education, Republic of Singapore, through its Start-Up Grant and Academic Research Fund Tier 1 (Grant No. RG61/22); in part by the National Natural Science Foundation of China (Grant No. 61906067); and by the China Postdoctoral Science Foundation (Grant Nos. 2019M651415 and 2020T13019).

Abstract: Matrix minimization techniques based on the nuclear norm are widely used in tasks such as image inpainting, clustering, classification, and reconstruction. However, as a relaxation of the rank function the nuclear norm introduces estimation bias and a heavy computational burden, which limits its effectiveness and efficiency in real-world scenarios. To address these challenges, this work studies generalized nonconvex rank regularization problems in robust matrix completion, low-rank representation, and robust matrix regression. We introduce effective and efficient low-rank matrix learning methods built on generalized nonconvex rank relaxations, inspired by the various surrogates of the ℓ_0-norm; these relaxations capture low-rank structure more accurately than the nuclear norm. The optimization strategy is a nonconvex, multi-variable alternating direction method of multipliers (ADMM), supported by a rigorous analysis of complexity and convergence, which updates the blocks of variables iteratively and converges efficiently. In addition, we incorporate randomized singular value decomposition and/or other acceleration strategies to improve computational efficiency, particularly for large-scale constrained minimization problems. Experimental results on a variety of image- and vision-related tasks demonstrate that the proposed methods outperform most related learning methods in both efficacy and efficiency.
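To make the ingredients named in the abstract concrete, the following is a minimal sketch, not the authors' released code: it pairs one common ℓ_0 surrogate (the MCP penalty with its firm-thresholding proximal operator) with a rank-k randomized SVD inside an ADMM-style block-update loop for a robust low-rank plus sparse decomposition. The penalty choice, the parameter names (lam, mu, rho, gamma), the fixed rank estimate k, and the fully observed setting are illustrative assumptions; the paper's robust matrix completion variant additionally restricts the equality constraint to the observed entries.

```python
import numpy as np
from sklearn.utils.extmath import randomized_svd  # randomized SVD routine from scikit-learn


def firm_threshold(sigma, lam, gamma=3.0):
    """Proximal operator of the MCP penalty (one ell_0 surrogate), applied
    elementwise to singular values; gamma > 1 controls how fast the shrinkage
    bias vanishes for large singular values."""
    return np.where(sigma <= lam, 0.0,
                    np.where(sigma <= gamma * lam,
                             gamma * (sigma - lam) / (gamma - 1.0),
                             sigma))


def nonconvex_svt(X, lam, k=50, gamma=3.0):
    """Generalized singular value thresholding: a rank-k randomized SVD
    replaces the full SVD, then the nonconvex shrinkage above is applied."""
    k = min(k, min(X.shape))
    U, s, Vt = randomized_svd(X, n_components=k, n_iter=5, random_state=0)
    return (U * firm_threshold(s, lam, gamma)) @ Vt


def soft_threshold(X, t):
    """Elementwise shrinkage, the proximal operator of the ell_1 norm."""
    return np.sign(X) * np.maximum(np.abs(X) - t, 0.0)


def robust_low_rank_decomposition(M, lam=0.05, mu=0.02, rho=1.0, n_iter=200):
    """ADMM-style block updates for
        min  phi(sigma(L)) + mu * ||S||_1   s.t.  L + S = M,
    where phi is the nonconvex rank relaxation and Y is the dual multiplier."""
    L = np.zeros_like(M)
    S = np.zeros_like(M)
    Y = np.zeros_like(M)
    for _ in range(n_iter):
        L = nonconvex_svt(M - S + Y / rho, lam / rho)   # low-rank block
        S = soft_threshold(M - L + Y / rho, mu / rho)   # sparse block
        Y = Y + rho * (M - L - S)                        # dual ascent step
    return L, S
```

The design point this sketch illustrates is the one the abstract emphasizes: the per-iteration cost is dominated by the singular value thresholding step, so swapping the exact SVD for a rank-k randomized SVD is what makes the scheme practical on large-scale problems, while the nonconvex shrinkage reduces the bias of nuclear-norm soft thresholding.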

Keywords: Nonconvex rank relaxations; alternating direction method of multipliers; robust matrix completion; low-rank representation; robust matrix regression; randomized singular value decomposition

Classification: TP181 (Automation and Computer Technology: Control Theory and Control Engineering); O241.6 (Automation and Computer Technology: Control Science and Engineering)

 
