A Distributed Computing Framework Based on Lightweight Variance Reduction Method to Accelerate Machine Learning Training on Blockchain  Cited by: 1


Authors: Zhen Huang, Feng Liu, Mingxing Tang, Jinyan Qiu, Yuxing Peng

Affiliations: [1] Science and Technology on Parallel and Distributed Laboratory, National University of Defense Technology, Changsha 410000, China; [2] H.R. Support Center, PLA, Beijing 100000, China

Source: China Communications, 2020, Issue 9, pp. 77-89 (13 pages)

Funding: partly supported by the National Key Basic Research Program of China (2016YFB1000100); partly supported by the National Natural Science Foundation of China (No. 61402490).

Abstract: To securely support large-scale intelligent applications, distributed machine learning based on blockchain is an intuitive solution. However, distributed machine learning is difficult to train because the corresponding optimization solvers converge slowly and place high demands on computing and memory resources. To overcome these challenges, we propose a distributed computing framework for the L-BFGS optimization algorithm based on a variance reduction method: a lightweight, low-overhead, parallelized scheme for the model training process. To validate these claims, we conducted experiments on multiple classical datasets. The results show that the proposed framework steadily accelerates solver training in both local and distributed modes.
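The paper's full method is not reproduced in this record, but the variance reduction idea it builds on can be illustrated with a minimal SVRG-style gradient estimator. This sketch is an assumption-laden toy (least-squares loss, NumPy, hypothetical names `svrg_gradient`, `w_snapshot`, `mu`), not the authors' L-BFGS framework: it shows how combining a per-sample gradient with a periodically refreshed full gradient yields an unbiased, low-variance stochastic gradient.

```python
import numpy as np

# Toy SVRG-style variance reduction on least squares:
# f(w) = (1/2n) * ||Xw - y||^2. All names here are illustrative.

def full_gradient(X, y, w):
    # Exact gradient averaged over all n samples.
    n = X.shape[0]
    return X.T @ (X @ w - y) / n

def sample_gradient(X, y, w, i):
    # Gradient of the loss on a single sample i.
    xi = X[i]
    return xi * (xi @ w - y[i])

def svrg_gradient(X, y, w, w_snapshot, mu, i):
    # Variance-reduced estimate: g_i(w) - g_i(w_snapshot) + mu,
    # where mu is the full gradient at the snapshot. The estimate is
    # unbiased, and its variance shrinks as w approaches w_snapshot.
    return (sample_gradient(X, y, w, i)
            - sample_gradient(X, y, w_snapshot, i)
            + mu)

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
w_true = rng.standard_normal(5)
y = X @ w_true

w = np.zeros(5)
lr = 0.05
for epoch in range(30):
    # Refresh the snapshot and its full gradient once per epoch,
    # then take cheap variance-reduced stochastic steps.
    w_snapshot = w.copy()
    mu = full_gradient(X, y, w_snapshot)
    for _ in range(200):
        i = int(rng.integers(200))
        w = w - lr * svrg_gradient(X, y, w, w_snapshot, mu, i)

print(float(np.linalg.norm(w - w_true)))
```

In the paper's setting the same low-variance gradient estimate feeds an L-BFGS update rather than plain SGD, which is what keeps the extra cost small: only one full-gradient pass per snapshot is added on top of the per-sample work.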

Keywords: machine learning; optimization algorithm; blockchain; distributed computing; variance reduction

Classification: TP311.13 [Automation and Computer Technology — Computer Software and Theory]; TP181 [Automation and Computer Technology — Computer Science and Technology]

 
