Information Divergence and the Generalized Normal Distribution: A Study on Symmetricity


Authors: Thomas L. Toulias, Christos P. Kitsos

Affiliations: [1] Department of Electrical and Electronics Engineering, University of West Attica, Campus 1, Egaleo, 12243 Athens, Greece; [2] Department of Informatics and Computer Engineering, University of West Attica, Campus 1, Egaleo, 12243 Athens, Greece

Source: Communications in Mathematics and Statistics, 2021, Issue 4, pp. 439-465 (27 pages)

Abstract: This paper investigates and discusses the use of information divergence, through the widely used Kullback-Leibler (KL) divergence, under the multivariate (generalized) γ-order normal distribution (γ-GND). The behavior of the KL divergence, as far as its symmetry is concerned, is studied by calculating the divergence of the γ-GND from the multivariate Student's t-distribution and vice versa. Certain special cases are also given and discussed. Furthermore, three symmetrized forms of the KL divergence, namely the Jeffreys distance, the geometric-KL distance, and the harmonic-KL distance, are computed between two members of the γ-GND family, while the corresponding differences between these information distances are also discussed.
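For orientation, a brief sketch of the standard definitions behind the abstract follows, stated in conventional notation; the paper's own normalizations (e.g., factors of 1/2 in the symmetrized forms) may differ, and the symbols P, Q, p, q, μ_i, Σ_i, d are generic placeholders rather than the paper's notation. The Kullback-Leibler divergence between probability measures P and Q with densities p and q on R^d is

\[
  \mathrm{KL}(P \,\|\, Q) = \int_{\mathbb{R}^d} p(x)\,\log\frac{p(x)}{q(x)}\,\mathrm{d}x ,
\]

and the three symmetrizations mentioned above are commonly given by

\begin{align*}
  \mathcal{J}(P,Q) &= \mathrm{KL}(P\|Q) + \mathrm{KL}(Q\|P) && \text{(Jeffreys distance)},\\
  \mathcal{G}(P,Q) &= \sqrt{\mathrm{KL}(P\|Q)\,\mathrm{KL}(Q\|P)} && \text{(geometric-KL distance)},\\
  \mathcal{H}(P,Q)^{-1} &= \mathrm{KL}(P\|Q)^{-1} + \mathrm{KL}(Q\|P)^{-1} && \text{(harmonic-KL, i.e., the resistor-average distance)}.
\end{align*}

For the Gaussian member of the γ-GND family (γ = 2), the directed divergence has the familiar closed form

\[
  \mathrm{KL}\bigl(\mathcal{N}(\mu_0,\Sigma_0)\,\|\,\mathcal{N}(\mu_1,\Sigma_1)\bigr)
  = \tfrac{1}{2}\Bigl[\operatorname{tr}\bigl(\Sigma_1^{-1}\Sigma_0\bigr)
  + (\mu_1-\mu_0)^{\top}\Sigma_1^{-1}(\mu_1-\mu_0) - d
  + \ln\frac{\det\Sigma_1}{\det\Sigma_0}\Bigr].
\]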

Keywords: Kullback-Leibler divergence; Jeffreys distance; Resistor-average distance; Multivariate γ-order normal distribution; Multivariate Student's t-distribution; Multivariate Laplace distribution

Classification: O17 (Mathematics)

 
