On the principles of Parsimony and Self-consistency for the emergence of intelligence (Cited by: 2)


Authors: Yi MA, Doris TSAO, Heung-Yeung SHUM

Affiliations: [1] Electrical Engineering and Computer Science Department, University of California, Berkeley, CA 94720, USA; [2] Department of Molecular & Cell Biology and Howard Hughes Medical Institute, University of California, Berkeley, CA 94720, USA; [3] International Digital Economy Academy, Shenzhen 518045, China

Source: Frontiers of Information Technology & Electronic Engineering, 2022, Issue 9, pp. 1298-1323 (26 pages)

Abstract: Ten years into the revival of deep networks and artificial intelligence, we propose a theoretical framework that sheds light on understanding deep networks within a bigger picture of intelligence in general. We introduce two fundamental principles, Parsimony and Self-consistency, which address two fundamental questions regarding intelligence: what to learn and how to learn, respectively. We believe the two principles serve as the cornerstone for the emergence of intelligence, artificial or natural. While they have rich classical roots, we argue that they can be stated anew in entirely measurable and computable ways. More specifically, the two principles lead to an effective and efficient computational framework, compressive closed-loop transcription, which unifies and explains the evolution of modern deep networks and most practices of artificial intelligence. While we use mainly visual data modeling as an example, we believe the two principles will unify understanding of broad families of autonomous intelligent systems and provide a framework for understanding the brain.
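The keyword "rate reduction" below refers to the coding-rate measure by which this line of work makes parsimony computable. As a sketch only, following the maximal coding rate reduction (MCR²) objective from the same authors' earlier work rather than anything quoted in this record (the symbols, normalization, and membership matrices here are assumptions), learned features \(\mathbf{Z} = [\mathbf{z}_1, \ldots, \mathbf{z}_m] \in \mathbb{R}^{d \times m}\) with class memberships \(\mathbf{\Pi} = \{\mathbf{\Pi}_j\}_{j=1}^{k}\) are chosen to maximize the difference between the coding rate of all features and the sum of the rates of each class:

```latex
% Coding rate of all features Z, up to distortion \epsilon:
R(\mathbf{Z};\epsilon) \;=\;
  \tfrac{1}{2}\log\det\!\Big(\mathbf{I} + \tfrac{d}{m\epsilon^{2}}\,\mathbf{Z}\mathbf{Z}^{\top}\Big)

% Average coding rate of the k classes, with diagonal membership
% matrices \Pi_j (tr(\Pi_j) = number of samples in class j):
R_c(\mathbf{Z};\epsilon \mid \mathbf{\Pi}) \;=\;
  \sum_{j=1}^{k} \frac{\operatorname{tr}(\mathbf{\Pi}_j)}{2m}
  \log\det\!\Big(\mathbf{I} + \tfrac{d}{\operatorname{tr}(\mathbf{\Pi}_j)\epsilon^{2}}\,
  \mathbf{Z}\mathbf{\Pi}_j\mathbf{Z}^{\top}\Big)

% Rate reduction: expand the whole, compress each class.
\max_{\mathbf{Z}} \;\; \Delta R(\mathbf{Z};\epsilon,\mathbf{\Pi})
  \;=\; R(\mathbf{Z};\epsilon) - R_c(\mathbf{Z};\epsilon \mid \mathbf{\Pi})
```

Intuitively, maximizing \(\Delta R\) expands the volume spanned by all features while compressing the features of each class, which is how the Parsimony principle is made measurable in this framework.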

Keywords: Intelligence, Parsimony, Self-consistency, Rate reduction, Deep networks, Closed-loop transcription

Classification: TP18 [Automation and Computer Technology: Control Theory and Control Engineering]
