A Distributed Conjugate Gradient Method for Solving Quadratic Loss Function Optimization Problems


Authors: Yu Jie; Meng Wenhui (School of Mathematics, Northwest University, Xi'an 710127, China)

Affiliation: [1] School of Mathematics, Northwest University, Xi'an 710127, Shaanxi, China

Source: Pure and Applied Mathematics (《纯粹数学与应用数学》), 2022, No. 1, pp. 116-126 (11 pages)

Funding: National Natural Science Foundation of China (11201373); Natural Science Foundation of the Education Department of Shaanxi Province (14JK747).

Abstract: Aiming at the optimization of large-scale data sets, this paper proposes an algorithm that optimizes a quadratic loss function with the conjugate gradient method in a distributed environment. The algorithm uses first-derivative information of each sub-machine's local loss function to update the iterate, performs two rounds of communication in each iteration, and minimizes the sum of the loss functions on the master machine through communication and cooperation. Theoretical analysis proves that the algorithm converges linearly under appropriate conditions. Compared with the distributed alternating direction method of multipliers (ADMM) on simulated data sets, the results show that the distributed conjugate gradient algorithm matches centralized performance in fewer iterations than distributed ADMM. Experiments further show that increasing the sample size on each sub-machine not only speeds up convergence but also reduces the computational error.
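The two-rounds-of-communication pattern described in the abstract can be sketched as a small simulation. This is a minimal illustrative sketch, not the paper's actual algorithm: the sub-machines are simulated as in-memory data shards, the conjugate direction uses the Fletcher-Reeves coefficient, and the step size comes from the exact line search available for quadratic losses. All function and variable names (`local_grad`, `local_hvp`, `distributed_cg`) are hypothetical.

```python
import numpy as np

def local_grad(X_k, y_k, w):
    # First-derivative information computed on one sub-machine:
    # gradient of (1/2) * ||X_k w - y_k||^2 with respect to w.
    return X_k.T @ (X_k @ w - y_k)

def local_hvp(X_k, p):
    # Local Hessian-vector product X_k^T X_k p; for a quadratic loss
    # the Hessian is constant, so no extra state is needed.
    return X_k.T @ (X_k @ p)

def distributed_cg(shards, dim, n, max_iter=100, tol=1e-10):
    # Minimize f(w) = (1/2n) * sum_i (x_i^T w - y_i)^2 where the rows
    # of (X, y) are split across the sub-machines in `shards`.
    w = np.zeros(dim)
    g_old, p = None, None
    for _ in range(max_iter):
        # Communication round 1: master sums the sub-machines' gradients.
        g = sum(local_grad(Xk, yk, w) for Xk, yk in shards) / n
        if np.linalg.norm(g) < tol:
            break
        if p is None:
            p = -g                                # first search direction
        else:
            beta = (g @ g) / (g_old @ g_old)      # Fletcher-Reeves coefficient
            p = -g + beta * p
        # Communication round 2: master sums local Hessian-vector products.
        Ap = sum(local_hvp(Xk, p) for Xk, _ in shards) / n
        alpha = -(g @ p) / (p @ Ap)               # exact line search (quadratic)
        w = w + alpha * p
        g_old = g
    return w
```

Because the loss is quadratic, the aggregated Hessian-vector product gives an exact step size, so each iteration needs only the two communication rounds mentioned in the abstract: one to sum the local gradients and one to sum the local Hessian-vector products.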

Keywords: big data; distributed optimization; conjugate gradient method; quadratic loss function; linear convergence

Classification: O224 [Science: Operations Research and Cybernetics]

 
