Authors: Muning WEN, Runji LIN, Hanjing WANG, Yaodong YANG, Ying WEN, Luo MAI, Jun WANG, Haifeng ZHANG, Weinan ZHANG
Affiliations: [1] School of Electronics Information and Electrical Engineering, Shanghai Jiao Tong University, Shanghai 200241, China; [2] Digital Brain Lab, Shanghai 201306, China; [3] Institute of Automation, Chinese Academy of Sciences, Beijing 100190, China; [4] School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing 100049, China; [5] Institute for Artificial Intelligence, Peking University, Beijing 100091, China; [6] School of Informatics, The University of Edinburgh, Edinburgh EH8 9JU, UK; [7] Department of Computer Science, University College London, London WC1E 6BT, UK
Published in: Frontiers of Computer Science, 2023, Issue 6, pp. 25-42 (18 pages)
Funding: The SJTU team was partially supported by the "New Generation of AI 2030" Major Project (2018AAA0100900), the Shanghai Municipal Science and Technology Major Project (2021SHZDZX0102), and the National Natural Science Foundation of China (Grant No. 62076161). Muning Wen is supported by the Wu Wen Jun Honorary Scholarship, AI Institute, Shanghai Jiao Tong University.
Abstract: Transformer architectures have facilitated the development of large-scale and general-purpose sequence models for prediction tasks in natural language processing and computer vision, e.g., GPT-3 and Swin Transformer. Although originally designed for prediction problems, it is natural to ask whether they are also suited to sequential decision-making and reinforcement learning (RL) problems, which are typically beset by long-standing issues involving sample efficiency, credit assignment, and partial observability. In recent years, sequence models, especially the Transformer, have attracted increasing interest in the RL community, spawning numerous approaches with notable effectiveness and generalizability. This survey presents a comprehensive overview of recent works aimed at solving sequential decision-making tasks with sequence models such as the Transformer, discussing the connection between sequential decision-making and sequence modeling, and categorizing these works by how they utilize the Transformer. Moreover, this paper puts forth potential avenues for future research intended to improve the effectiveness of large sequence models for sequential decision-making, encompassing theoretical foundations, network architectures, algorithms, and efficient training systems.
Keywords: sequential decision-making; sequence modeling; Transformer; training system
Classification: TP391 (Automation and Computer Technology / Computer Application Technology)
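The abstract's central idea, casting sequential decision-making as sequence modeling, can be sketched concretely. In the Decision Transformer line of work (one family the surveyed approaches build on), a trajectory is flattened into a token sequence of (return-to-go, state, action) triples that an autoregressive model can consume. The helper names below are illustrative, not from the paper itself:

```python
def returns_to_go(rewards):
    """Suffix sums: R_t = sum of rewards from step t onward (discount = 1)."""
    rtg, total = [], 0.0
    for r in reversed(rewards):
        total += r
        rtg.append(total)
    return list(reversed(rtg))

def trajectory_to_sequence(states, actions, rewards):
    """Interleave (return-to-go, state, action) triples into one flat
    sequence, the input format an autoregressive sequence model would see."""
    seq = []
    for R, s, a in zip(returns_to_go(rewards), states, actions):
        seq.extend([("R", R), ("s", s), ("a", a)])
    return seq

# Toy 3-step trajectory.
states, actions, rewards = [0, 1, 2], [1, 0, 1], [1.0, 0.0, 2.0]
print(returns_to_go(rewards))                 # [3.0, 2.0, 2.0]
print(trajectory_to_sequence(states, actions, rewards))
```

Conditioning action prediction on the return-to-go token is what lets such models trade explicit value estimation for supervised sequence prediction, sidestepping some of the credit-assignment machinery of classical RL.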