Authors: Andre Milzarek, Xiantao Xiao, Zaiwen Wen, Michael Ulbrich
Affiliations: [1] School of Data Science, The Chinese University of Hong Kong, Shenzhen, Shenzhen 518172, China [2] Shenzhen Research Institute of Big Data, Shenzhen 518172, China [3] Shenzhen Institute of Artificial Intelligence and Robotics for Society, Shenzhen 518172, China [4] School of Mathematical Sciences, Dalian University of Technology, Dalian 116024, China [5] Beijing International Center for Mathematical Research, Peking University, Beijing 100871, China [6] Department of Mathematics, Technical University of Munich, Garching bei München 85748, Germany
Source: Science China Mathematics (中国科学:数学(英文版)), 2022, No. 10, pp. 2151-2170 (20 pages)
Funding: supported by the Fundamental Research Fund - Shenzhen Research Institute for Big Data Startup Fund (Grant No. JCYJ-AM20190601); the Shenzhen Institute of Artificial Intelligence and Robotics for Society; the National Natural Science Foundation of China (Grant Nos. 11831002 and 11871135); the Key-Area Research and Development Program of Guangdong Province (Grant No. 2019B121204008); and the Beijing Academy of Artificial Intelligence.
Abstract: In this work, we present probabilistic local convergence results for a stochastic semismooth Newton method for a class of stochastic composite optimization problems in which the objective function is the sum of a smooth nonconvex term and a nonsmooth convex term. We assume that the gradient and Hessian information of the smooth part of the objective function can only be approximated and accessed via calling stochastic first- and second-order oracles. The approach combines stochastic semismooth Newton steps, stochastic proximal gradient steps, and a globalization strategy based on growth conditions. We present tail bounds and matrix concentration inequalities for the stochastic oracles that can be utilized to control the approximation errors by appropriately adjusting or increasing the sampling rates. Under standard local assumptions, we prove that the proposed algorithm locally turns into a pure stochastic semismooth Newton method and converges r-linearly or r-superlinearly with high probability.
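The algorithmic structure described in the abstract (a stochastic semismooth Newton candidate step, an acceptance test based on a growth condition on the residual, and a stochastic proximal gradient fallback) can be illustrated on an l1-regularized least-squares model. The sketch below is a minimal, hypothetical reconstruction for illustration only, not the authors' implementation: the function names (`stochastic_ssn`, `prox_l1`), the choice of model, and the parameters (`alpha`, `theta`, `batch`) are all assumptions.

```python
import numpy as np

def prox_l1(v, t):
    # Soft-thresholding: proximal operator of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def stochastic_ssn(A, b, lam=0.1, alpha=0.5, theta=0.9, batch=32, iters=200, seed=0):
    """Hypothetical sketch of a stochastic semismooth Newton method for
    min_x (1/2n)||Ax - b||^2 + lam * ||x||_1, using subsampled (stochastic)
    gradient/Hessian oracles and a growth-condition fallback to
    stochastic proximal gradient steps."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)

    def residual(x, Ai, bi):
        # Natural residual F(x) = x - prox(x - alpha*g); F(x) = 0 at stationarity.
        g = Ai.T @ (Ai @ x - bi) / len(bi)          # stochastic gradient oracle
        return x - prox_l1(x - alpha * g, alpha * lam), g

    for _ in range(iters):
        idx = rng.choice(n, size=batch, replace=False)
        Ai, bi = A[idx], b[idx]
        F, g = residual(x, Ai, bi)
        # An element of the Clarke Jacobian of F: M = I - D(I - alpha*H),
        # where D is the 0/1 diagonal generalized derivative of the prox.
        H = Ai.T @ Ai / batch                        # stochastic Hessian oracle
        d_mask = (np.abs(x - alpha * g) > alpha * lam).astype(float)
        M = np.eye(d) - d_mask[:, None] * (np.eye(d) - alpha * H)
        try:
            x_newton = x - np.linalg.solve(M, F)     # semismooth Newton candidate
            F_new, _ = residual(x_newton, Ai, bi)
            if np.linalg.norm(F_new) <= theta * np.linalg.norm(F):
                x = x_newton                         # growth condition holds: accept
                continue
        except np.linalg.LinAlgError:
            pass
        x = prox_l1(x - alpha * g, alpha * lam)      # fallback: stochastic prox-gradient step
    return x
```

In this sketch, the growth condition plays the role of the paper's globalization strategy: the Newton step is kept only when it contracts the (stochastic) residual by the factor `theta`; otherwise the iteration falls back to a plain stochastic proximal gradient step. Increasing `batch` corresponds to raising the sampling rate to tighten the oracle approximation errors.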
Keywords: nonsmooth stochastic optimization; stochastic approximation; semismooth Newton method; stochastic second-order information; local convergence
Classification: O224 [Science - Operations Research and Cybernetics]