Affiliations: [1] Institute of Artificial Intelligence, Zhejiang University, Hangzhou 310058 [2] Shanghai Institute for Advanced Study, Zhejiang University, Shanghai 201203 [3] Department of Computer Science and Engineering, Shanghai Jiao Tong University, Shanghai 200241 [4] School of Software Engineering, East China Normal University, Shanghai 200062 [5] Taobao (China) Software Co., Ltd., Hangzhou 310023 [6] School of Software, Shandong University, Jinan 250011
Source: Journal of Image and Graphics (中国图象图形学报), 2024, Issue 6, pp. 1510-1534 (25 pages)
Funding: National Science and Technology Major Project on New Generation Artificial Intelligence (2022ZD0119100); National Natural Science Foundation of China (62037001, 62441605); Zhejiang Provincial Science and Technology Program (2022C01044); Starry Night Science Fund (Zhejiang University).
Abstract: Generative foundation models are facilitating significant transformations in the field of artificial intelligence. They demonstrate general capabilities in diverse tasks, including natural language processing, multimodal content understanding, and content synthesis. Generative foundation models often consist of billions or even hundreds of billions of parameters. Thus, they are usually deployed on the cloud side to provide powerful and general intelligent services. However, this type of service faces crucial challenges in practice, such as high latency induced by communication between the cloud and local devices, and insufficient personalization, because servers often cannot access local data owing to privacy concerns. By contrast, low-complexity lightweight models are deployed on the device side to capture personalized and dynamic scenario data, but they may suffer from poor generalization. Large and lightweight (large-small) model collaboration aims to integrate the general intelligence of large foundation models with the personalized intelligence of small lightweight models, empowering downstream vertical domain-specific applications through the interaction and co-evolution of both types of models. This topic has recently attracted increasing attention in academia and industry and is regarded as an important technological trend. Taking large language models (LLMs) and large multimodal models as representatives, this paper first reviews the mainstream Transformer-based architectures of generative foundation models, including encoder-only, decoder-only, and encoder-decoder designs, together with typical pre-training technologies such as next sentence prediction, sequence-to-sequence modeling, and contrastive learning, as well as adaptation and fine-tuning methods. It then introduces the development history and recent progress of key model miniaturization techniques in the era of large models, including model pruning, model quantization, and knowledge distillation. According to the purposes and principles of inter-model collaboration, a taxonomy of large-small model co-evolution into collaborative training, collaborative inference, and collaborative planning is proposed, and a series of representative new techniques and ideas, such as cloud-edge bidirectional model distillation, modular design, and generative agents, is summarized. Overall, this paper examines the international and domestic development status of large-small model co-evolution from three aspects: generative foundation models, model miniaturization techniques, and cloud-edge collaboration between large and small models. It compares strengths and gaps and analyzes the development trends of foundation-model empowerment from the perspectives of application prospects, model architecture design, vertical-domain model fusion, personalization, and safety and trustworthiness challenges.
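Among the miniaturization techniques surveyed, knowledge distillation transfers the behavior of a large teacher model into a small student model by matching softened output distributions while also fitting the ground-truth labels. The snippet below is a minimal, self-contained PyTorch sketch of this generic idea only; the toy teacher/student networks, the temperature T, the weight alpha, and the data are hypothetical and are not taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical teacher (large cloud-side model) and student (small on-device model).
teacher = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 10))
student = nn.Sequential(nn.Linear(128, 32), nn.ReLU(), nn.Linear(32, 10))

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Soft-label KL loss (teacher -> student) combined with hard-label CE loss."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Toy batch standing in for personalized device-side data.
x = torch.randn(16, 128)
y = torch.randint(0, 10, (16,))

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
optimizer.zero_grad()
with torch.no_grad():
    t_logits = teacher(x)   # teacher inference (e.g., performed on the cloud side)
s_logits = student(x)        # student forward pass on the device
loss = distillation_loss(s_logits, t_logits, y)
loss.backward()
optimizer.step()
```

The cloud-edge bidirectional distillation mentioned in the abstract extends this one-way scheme: knowledge learned by the device-side student on personalized data is also fed back to refine the cloud-side model.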
Keywords: generative large models; large model miniaturization; large-small model co-evolution; cloud-edge collaborative evolution; generative agents; generative artificial intelligence
Classification code: TP391.4 [Automation and Computer Technology — Computer Application Technology]