Authors: CHEN Meng-qiang; YAN Zi-jie; YE Yan; WU Wei-gang [1] (School of Data and Computer Science, Sun Yat-sen University, Guangzhou 510006, China)
Affiliation: [1] School of Data and Computer Science, Sun Yat-sen University, Guangzhou 510006, China
Source: Computer Engineering & Science, 2018, No. A01, pp. 133-140 (8 pages)
Funding: National Key R&D Program of China (2016YFB0200404); National Natural Science Foundation of China (U1711263)
Abstract: Deep learning has been widely applied in many fields, especially big data analysis, but the computation it requires is becoming increasingly complex and large in scale. To accelerate the training of large-scale deep networks, various distributed parallel training protocols have been proposed. We design a novel asynchronous training protocol, the weighted asynchronous parallel protocol (WASP), to update neural network parameters more effectively. The core of WASP is its treatment of "gradient staleness": it measures the staleness of a gradient by parameter version numbers and weights gradients accordingly, reducing the influence of stale gradients on the parameters. Moreover, by periodically forcing synchronization of the model parameters, WASP combines the advantages of synchronous and asynchronous parallel protocols, converging rapidly and speeding up model training. We conduct experiments on the Tianhe-2 supercomputer with two classical convolutional neural networks, LeNet-5 and ResNet-101, and the results show that WASP achieves a higher speedup and more stable convergence than existing asynchronous parallel training protocols.
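The abstract names two mechanisms, staleness-weighted gradient updates keyed to parameter version numbers and periodic forced synchronization, but does not give their exact formulas. The Python sketch below illustrates one way such a scheme could look on a parameter server; the weighting rule 1/(staleness + 1), the class and method names, and the synchronization period are illustrative assumptions, not the paper's actual design.

import numpy as np

class ParameterServer:
    """Minimal, hypothetical sketch of a staleness-weighted asynchronous
    parameter server in the spirit of WASP. The weighting rule and the
    forced-synchronization period are assumptions for illustration only."""

    def __init__(self, init_params, lr=0.01, sync_period=10):
        self.params = np.array(init_params, dtype=float)  # global model parameters
        self.version = 0                                   # global parameter version number
        self.lr = lr
        self.sync_period = sync_period                     # force a global sync every N pushes

    def pull(self):
        # A worker fetches the current parameters together with their version.
        return self.params.copy(), self.version

    def push(self, grad, grad_version):
        # Staleness = number of global updates applied since the worker pulled.
        staleness = self.version - grad_version
        weight = 1.0 / (staleness + 1.0)   # assumed down-weighting of stale gradients
        self.params -= self.lr * weight * np.asarray(grad, dtype=float)
        self.version += 1
        # Periodic forced synchronization: in a real system all workers would
        # block here until each has pulled the latest parameters.
        if self.version % self.sync_period == 0:
            self.synchronize_all_workers()

    def synchronize_all_workers(self):
        # Placeholder barrier; a real implementation would coordinate workers
        # (e.g. via MPI) so that all of them restart from the same parameters.
        pass

# Example: two simulated workers pushing gradients of different staleness.
ps = ParameterServer(init_params=[0.0, 0.0])
params_a, ver_a = ps.pull()   # worker A pulls at version 0
params_b, ver_b = ps.pull()   # worker B pulls at version 0
ps.push([1.0, 1.0], ver_a)    # fresh gradient, full weight
ps.push([1.0, 1.0], ver_b)    # staleness 1, weight 1/2
print(ps.params)              # expected: [-0.015, -0.015] with lr=0.01

The example shows the intended effect of the weighting: a gradient computed against outdated parameters still contributes, but with a smaller step, while the periodic barrier bounds how far workers can drift apart.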
Keywords: deep learning; distributed parallelism; Tianhe-2; parameter server; staleness
Classification: TP393.027 [Automation and Computer Technology - Computer Application Technology]