Journal articles: 1 article found
On the principles of Parsimony and Self-consistency for the emergence of intelligence (Cited by: 2)
Authors: Yi MA, Doris TSAO, Heung-Yeung SHUM. 《Frontiers of Information Technology & Electronic Engineering》 (SCIE, EI, CSCD), 2022, Issue 9, pp. 1298-1323 (26 pages)
Ten years into the revival of deep networks and artificial intelligence, we propose a theoretical framework that sheds light on understanding deep networks within a bigger picture of intelligence in general. We introduce two fundamental principles, Parsimony and Self-consistency, which address two fundamental questions regarding intelligence: what to learn and how to learn, respectively. We believe the two principles serve as the cornerstone for the emergence of intelligence, artificial or natural. While they have rich classical roots, we argue that they can be stated anew in entirely measurable and computable ways. More specifically, the two principles lead to an effective and efficient computational framework, compressive closed-loop transcription, which unifies and explains the evolution of modern deep networks and most practices of artificial intelligence. While we use mainly visual data modeling as an example, we believe the two principles will unify understanding of broad families of autonomous intelligent systems and provide a framework for understanding the brain.
Keywords: Intelligence; Parsimony; Self-consistency; Rate reduction; Deep networks; Closed-loop transcription
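The abstract's claim that Parsimony can be "stated in entirely measurable and computable ways" refers in part to the rate-reduction measure listed in the keywords. As a minimal sketch of that idea (based on the authors' related work on maximal coding rate reduction, not on code from this paper), one can estimate the coding rate of a set of features with a log-determinant, and measure how much is gained by coding each class separately; the `eps` distortion parameter and the toy data below are illustrative assumptions:

```python
import numpy as np

def coding_rate(Z, eps=0.5):
    # Z: d x n matrix whose n columns are feature vectors in R^d.
    # Log-det estimate of the bits needed to code Z up to distortion eps.
    d, n = Z.shape
    return 0.5 * np.linalg.slogdet(np.eye(d) + (d / (n * eps**2)) * Z @ Z.T)[1]

def rate_reduction(Z, labels, eps=0.5):
    # Rate of the whole feature set minus the average rate of each class
    # coded on its own: large when the whole set is "expanded" while each
    # class is "compressed" (a computable statement of Parsimony).
    d, n = Z.shape
    within = 0.0
    for c in np.unique(labels):
        Zc = Z[:, labels == c]          # features of class c
        nc = Zc.shape[1]
        within += (nc / n) * 0.5 * np.linalg.slogdet(
            np.eye(d) + (d / (nc * eps**2)) * Zc @ Zc.T)[1]
    return coding_rate(Z, eps) - within

# Two classes lying on orthogonal directions: positive rate reduction.
Z = np.array([[1., 1., 0., 0.],
              [0., 0., 1., 1.]])
labels = np.array([0, 0, 1, 1])
print(rate_reduction(Z, labels))  # > 0 for incoherent classes
```

In the closed-loop transcription framework described in the abstract, a quantity of this kind serves as the learning objective that both the encoder and decoder optimize, rather than a task-specific loss.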