PROXIMAL-PROXIMAL-GRADIENT METHOD  (Cited by: 1)


Authors: Ernest K. Ryu, Wotao Yin

Affiliation: [1] Department of Mathematics, University of California, Los Angeles, CA 90095, USA

Source: Journal of Computational Mathematics (计算数学(英文)), 2019, No. 6, pp. 778-812 (35 pages)

Abstract: In this paper, we present the proximal-proximal-gradient method (PPG), a novel optimization method that is simple to implement and simple to parallelize. PPG generalizes the proximal-gradient method and ADMM, and is applicable to minimization problems written as a sum of many differentiable and many non-differentiable convex functions. The non-differentiable functions can be coupled. We furthermore present a related stochastic variant, which we call stochastic PPG (S-PPG). S-PPG can be interpreted as a generalization of Finito and MISO to sums of many coupled non-differentiable convex functions. We present many applications that can benefit from PPG and S-PPG and prove convergence for both methods. We demonstrate the empirical effectiveness of both methods through experiments on a CUDA GPU. A key strength of PPG and S-PPG, compared to existing methods, is their ability to directly handle a large sum of non-differentiable, nonseparable functions with a constant stepsize that is independent of the number of functions. Such non-diminishing stepsizes allow both methods to be fast.
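
The abstract does not spell out the update rule, so the following is a minimal Python sketch of the PPG iteration as it is commonly stated, for minimizing r(x) + (1/n) * sum_i (f_i(x) + g_i(x)) with each f_i differentiable and r and each g_i proximable. The names prox_r, prox_g, grad_f, and the stepsize alpha are illustrative assumptions, not the paper's notation; consult the full text (pp. 778-812) for the exact statement and stepsize conditions.

    import numpy as np

    def ppg(prox_r, prox_g, grad_f, z0, alpha, num_iters):
        """Sketch of the PPG iteration (assumed form, not verbatim from the paper).

        Target: minimize r(x) + (1/n) * sum_i (f_i(x) + g_i(x)).
          prox_r(v, a)    -- proximal operator of r with parameter a
          prox_g[i](v, a) -- proximal operator of g_i with parameter a
          grad_f[i](x)    -- gradient of the differentiable term f_i
          z0              -- list of n initial vectors z_i (one per summand)
        """
        z = [zi.copy() for zi in z0]
        n = len(z)
        for _ in range(num_iters):
            z_bar = sum(z) / n                  # average of the maintained copies
            x_half = prox_r(z_bar, alpha)       # first proximal step (on r)
            for i in range(n):                  # embarrassingly parallel over i
                v = 2.0 * x_half - z[i] - alpha * grad_f[i](x_half)
                x_i = prox_g[i](v, alpha)       # second proximal step (on g_i)
                z[i] = z[i] + x_i - x_half      # fixed-point-style update of z_i
        return prox_r(sum(z) / n, alpha)

In the stochastic variant S-PPG, one would sample a single index i per iteration, update only z[i], and maintain z_bar incrementally at O(1) extra cost per step; per the abstract, the constant stepsize alpha does not need to shrink with the number of functions n.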

Keywords: proximal-gradient, ADMM, Finito, MISO, SAGA, operator splitting, first-order methods, distributed optimization, stochastic optimization, almost sure convergence, linear convergence

Classification: O17 [Science / Mathematics]

 
