Authors: Yi MA, Doris TSAO, Heung-Yeung SHUM
Affiliations: [1] Electrical Engineering and Computer Science Department, University of California, Berkeley, CA 94720, USA; [2] Department of Molecular & Cell Biology and Howard Hughes Medical Institute, University of California, Berkeley, CA 94720, USA; [3] International Digital Economy Academy, Shenzhen 518045, China
Source: Frontiers of Information Technology & Electronic Engineering, 2022, Issue 9, pp. 1298-1323 (26 pages)
Abstract: Ten years into the revival of deep networks and artificial intelligence, we propose a theoretical framework that sheds light on understanding deep networks within a bigger picture of intelligence in general. We introduce two fundamental principles, Parsimony and Self-consistency, which address two fundamental questions regarding intelligence: what to learn and how to learn, respectively. We believe the two principles serve as the cornerstone for the emergence of intelligence, artificial or natural. While they have rich classical roots, we argue that they can be stated anew in entirely measurable and computable ways. More specifically, the two principles lead to an effective and efficient computational framework, compressive closed-loop transcription, which unifies and explains the evolution of modern deep networks and most practices of artificial intelligence. While we use mainly visual data modeling as an example, we believe the two principles will unify understanding of broad families of autonomous intelligent systems and provide a framework for understanding the brain.
Keywords: Intelligence; Parsimony; Self-consistency; Rate reduction; Deep networks; Closed-loop transcription
Classification: TP18 [Automation and Computer Technology: Control Theory and Control Engineering]
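The "Rate reduction" keyword refers to the coding-rate measure the authors use (in their earlier maximal coding rate reduction, MCR², work) to quantify the Parsimony principle mentioned in the abstract. Below is a minimal NumPy sketch of that measure under stated assumptions: the function names, the quantization tolerance eps, and the random test data are illustrative choices, not taken from this paper.

```python
import numpy as np

def coding_rate(Z, eps=0.5):
    """Coding rate R(Z) = 1/2 * logdet(I + d/(n*eps^2) * Z Z^T)
    for n samples of dimension d stored as the columns of Z."""
    d, n = Z.shape
    gram = np.eye(d) + (d / (n * eps**2)) * (Z @ Z.T)
    return 0.5 * np.linalg.slogdet(gram)[1]

def rate_reduction(Z, labels, eps=0.5):
    """Delta R = R(Z) - sum_j (n_j / n) * R(Z_j): rate of the whole
    representation minus the class-weighted average rate of each
    class-conditional subset Z_j (larger is more parsimonious yet
    more discriminative)."""
    n = Z.shape[1]
    r_whole = coding_rate(Z, eps)
    r_classes = sum(
        (np.sum(labels == c) / n) * coding_rate(Z[:, labels == c], eps)
        for c in np.unique(labels)
    )
    return r_whole - r_classes

# Toy usage (hypothetical data): random features on the unit sphere
# with 4 arbitrary class labels.
rng = np.random.default_rng(0)
Z = rng.standard_normal((16, 200))
Z /= np.linalg.norm(Z, axis=0)
labels = rng.integers(0, 4, 200)
print(f"Delta R = {rate_reduction(Z, labels):.3f}")
```

In the closed-loop transcription framework described in the abstract, a measure of this kind serves as the objective that the encoder and decoder optimize in a game-theoretic fashion; the sketch above only illustrates how the measure itself is computed.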