Universal Scaling Laws in Quantum-Probabilistic Machine Learning by Tensor Network: Toward Interpreting Representation and Generalization Powers  


Authors: Sheng-Chen Bai, Shi-Ju Ran

Affiliation: [1] Center for Quantum Physics and Intelligent Sciences, Department of Physics, Capital Normal University, Beijing 100048, China

Source: Chinese Physics Letters, 2024, Issue 12, pp. 35-45 (11 pages)

Funding: Supported in part by the Beijing Natural Science Foundation (Grant No. 1232025); the Ministry of Education Key Laboratory of Quantum Physics and Photonic Quantum Information (Grant No. ZYGX2024K020); and the Academy for Multidisciplinary Studies, Capital Normal University.

Abstract: The interpretation of representation and generalization powers has been a long-standing challenge in the fields of machine learning (ML) and artificial intelligence. This study contributes to understanding the emergence of universal scaling laws in quantum-probabilistic ML. We consider the generative tensor network (GTN) in the form of a matrix-product state as an example and show that with an untrained GTN (such as a random TN state), the negative logarithmic likelihood (NLL) L generally increases linearly with the number of features M, that is, L ≃ kM + const. This is a consequence of the so-called "catastrophe of orthogonality," which states that quantum many-body states tend to become exponentially orthogonal to each other as M increases. This study reveals that, while gaining information through training, the linear-scaling law is suppressed by a negative quadratic correction, leading to L ≃ βM − αM² + const. The scaling coefficients exhibit logarithmic relationships with the number of training samples and the number of quantum channels χ. The emergence of a quadratic correction term in the NLL for the testing (training) set can be regarded as evidence of the generalization (representation) power of the GTN. Over-parameterization can be identified by the deviation in the values of α between the training and testing sets while increasing χ. We further investigate how orthogonality in the quantum-feature map relates to the satisfaction of the quantum-probabilistic interpretation and to the representation and generalization powers of the GTN. Unveiling universal scaling laws in quantum-probabilistic ML would be a valuable step toward establishing a white-box ML scheme interpreted within the quantum-probabilistic framework.

Keywords: QUANTUM, GENERALIZATION, SCALING

Classification: TP181 [Automation and Computer Technology: Control Theory and Control Engineering]
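The linear scaling of the NLL for an untrained GTN can be checked numerically. The sketch below (not the authors' code; a minimal illustration assuming a random Gaussian matrix-product state, a binary basis feature map, and numpy only) estimates the mean NLL of a random MPS on random binary samples for several feature counts M and fits the slope, which should be positive, consistent with L ≃ kM + const.

```python
import numpy as np

# Illustrative sketch: NLL of a random (untrained) MPS-form GTN grows
# roughly linearly with the number of features M, as implied by the
# "catastrophe of orthogonality". All names here are hypothetical.

def random_mps(M, chi, d=2, rng=None):
    """One rank-3 Gaussian tensor per feature; boundary bonds have dimension 1."""
    rng = rng or np.random.default_rng(0)
    dims = [1] + [chi] * (M - 1) + [1]
    return [rng.standard_normal((dims[i], d, dims[i + 1])) for i in range(M)]

def log_prob(mps, x):
    """log p(x) = log |<x|psi>|^2 - log <psi|psi> for a product basis state |x>."""
    v = np.ones(1)
    for A, xi in zip(mps, x):
        v = v @ A[:, xi, :]  # contract one site of <x|psi>
    E = np.ones((1, 1))
    for A in mps:
        E = np.einsum('ab,aic,bid->cd', E, A, A)  # transfer matrix for the norm
    return np.log(v[0] ** 2 + 1e-300) - np.log(E[0, 0])

def mean_nll(M, chi=4, n_samples=200, seed=0):
    """Average NLL of a fresh random MPS over random binary samples."""
    rng = np.random.default_rng(seed)
    mps = random_mps(M, chi, rng=rng)
    xs = rng.integers(0, 2, size=(n_samples, M))
    return -np.mean([log_prob(mps, x) for x in xs])

Ms = [4, 8, 12, 16]
nll = {M: mean_nll(M) for M in Ms}
slope, intercept = np.polyfit(Ms, [nll[M] for M in Ms], 1)
print({M: round(nll[M], 2) for M in Ms}, 'slope ~', round(slope, 2))
```

For an untrained state the fitted slope stays positive as M grows; the paper's point is that training introduces the negative quadratic correction −αM² on top of this baseline.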
