Meanings of Generalized Entropy and Generalized Mutual Information for Coding  (Cited by: 5)


Author: Lu Chenguang (鲁晨光) [1]

Affiliation: [1] Changsha University (长沙大学)

Source: Journal on Communications (《通信学报》), 1994, No. 6, pp. 37-44 (8 pages)

Abstract: This paper first briefly introduces the predictive entropy, generalized entropy, and generalized mutual information proposed earlier by the author, and then discusses their meanings for coding. It defines the rate-of-limiting-errors function R(A_J), proves that the generalized entropy is the minimum of the Shannon mutual information under a given error limit A_J, and explains the relationship among the generalized entropy, the rate-of-limiting-errors function, and the rate-distortion function. Replacing the distortion measure d(x_i, y_j) of the classical theory with the generalized information I(x_i; y_j) turns rate-distortion theory into the rate-of-keeping-precision theory. The paper also provides the derived rate-of-keeping-precision function of a binary source with a similarity relation, gives the function R(G), computed numerically, of a multiple source using image vision as an example, and shows the relations among R(G), the quantization grades of the source, and subjective discrimination with fuzziness. The rate-of-keeping-precision theory is not only a theory of data compression but also a theory of optimally matching objective information with subjective understanding.
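To make the rate-distortion machinery that the paper modifies concrete, the sketch below computes a point on the classical rate-distortion curve R(D) for a binary source with the standard Blahut-Arimoto iteration. This is NOT the author's R(G): the paper's revision would substitute the generalized information I(x_i; y_j) for the distortion d(x_i, y_j) in the exponential weighting below. The function name, the slope parameter s, and the iteration count are illustrative choices, not from the paper.

```python
import math

def blahut_arimoto(p, d, s, iters=200):
    """One point on the classical rate-distortion curve via Blahut-Arimoto.

    p: source distribution p(x); d: distortion matrix d[x][y];
    s: positive slope parameter selecting the point on the curve
       (q(y|x) is weighted by exp(-s * d(x, y))).
    Returns (R, D) in bits. The paper's rate-of-keeping-precision
    theory would replace d(x, y) here with generalized information."""
    nx, ny = len(p), len(d[0])
    q = [1.0 / ny] * ny                       # reproduction distribution q(y)
    for _ in range(iters):
        # Conditional q(y|x) proportional to q(y) * exp(-s * d(x, y)).
        cond = []
        for x in range(nx):
            row = [q[y] * math.exp(-s * d[x][y]) for y in range(ny)]
            z = sum(row)
            cond.append([r / z for r in row])
        # Re-estimate the marginal q(y).
        q = [sum(p[x] * cond[x][y] for x in range(nx)) for y in range(ny)]
    D = sum(p[x] * cond[x][y] * d[x][y]
            for x in range(nx) for y in range(ny))
    R = sum(p[x] * cond[x][y] * math.log2(cond[x][y] / q[y])
            for x in range(nx) for y in range(ny) if cond[x][y] > 0)
    return R, D

# Uniform binary source with Hamming distortion, where theory
# gives R(D) = 1 - H(D) for 0 <= D <= 1/2.
p = [0.5, 0.5]
d = [[0.0, 1.0], [1.0, 0.0]]
R, D = blahut_arimoto(p, d, s=2.0)
```

For the binary symmetric case the iteration's fixed point is reached immediately by symmetry, and the returned (R, D) pair lies exactly on the curve R(D) = 1 - H(D), which makes the sketch easy to check against the closed form.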

Keywords: generalized entropy, generalized mutual information, source coding, coding

Classification: TN911.21 [Electronics and Telecommunications — Communications and Information Systems]

 
