Authors: ZHANG Shuqing; ZHOU Yibo (ZhengZhou University of Industrial Technology, Zhengzhou, Henan 451100, China)
Source: Information & Computer (《信息与电脑》), 2023, No. 23, pp. 35-37 (3 pages)
Abstract: The rapid development of deep learning technology has driven the need for efficient training on large-scale data sets, yet traditional deep learning training strategies are inefficient in meeting this challenge. To address this problem, this paper conducts an in-depth study of traditional parallel computing strategies and of distributed deep learning training strategies based on data parallelism, and proposes a cluster resource scheduling optimization method based on heterogeneous computing resources. Experiments show that, compared with traditional parallel methods, the new method offers significant advantages in training time and computing resource utilization, providing strong support for the efficient training of large-scale deep learning tasks.
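The data-parallel training strategy the abstract refers to can be illustrated with a minimal sketch (this is a generic illustration of synchronous data parallelism, not the paper's implementation; the shard sizes, learning rate, and toy linear model are assumptions for demonstration): each worker holds a shard of the data, computes a local gradient, and an all-reduce step averages the gradients before a shared parameter update.

```python
# Minimal sketch of synchronous data-parallel training for a 1-D linear
# model y = w*x. Workers are simulated sequentially; in a real cluster
# each local gradient would be computed on a separate device.

def local_gradient(w, shard):
    # Mean gradient of the squared error 0.5*(w*x - y)^2 over one shard.
    return sum((w * x - y) * x for x, y in shard) / len(shard)

def all_reduce_mean(grads):
    # Stand-in for a collective all-reduce averaging across workers.
    return sum(grads) / len(grads)

def data_parallel_step(w, shards, lr=0.1):
    grads = [local_gradient(w, s) for s in shards]  # parallel in practice
    return w - lr * all_reduce_mean(grads)

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0), (4.0, 8.0)]  # samples of y = 2x
shards = [data[:2], data[2:]]  # two equal-sized workers

w = 0.0
for _ in range(100):
    w = data_parallel_step(w, shards)
```

With equal-sized shards, averaging the per-shard mean gradients reproduces the full-batch gradient exactly, so the parallel run converges to the same solution (here w approaches 2.0) as single-machine training; unequal shards would require weighting each gradient by its shard size.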
CLC Number: TP391.41 [Automation and Computer Technology - Computer Application Technology]