Author: 鲁晨光 (Lu Chenguang) [1]
Affiliation: [1] Changsha University (长沙大学)
Source: Journal on Communications (《通信学报》), 1994, No. 6, pp. 37-44 (8 pages)
Abstract: This paper first briefly introduces the predictive entropy, generalized entropy, and generalized mutual information proposed earlier by the author, and then discusses their coding meanings. It defines the rate-of-limiting-errors function R(A_J), proves that the generalized entropy is the minimum of the Shannon mutual information under a given error limit A_J, and explains the relationship between the generalized entropy, the rate-of-limiting-errors function, and the rate-distortion function. By replacing the distortion measure d(x_i, y_j) of the classical theory with the generalized information I(x_i; y_j), rate-distortion theory is recast as a rate-of-keeping-precision theory. The paper also provides the derived rate-of-keeping-precision function of a binary source with a similarity relation, gives the numerically computed function R(G) of a multi-valued source with image vision as the example, and shows the relations between R(G), the quantization grades of a pixel, and subjective discrimination with fuzziness. The rate-of-keeping-precision theory is not only a theory of data compression but also a theory of optimally matching objective information with subjective understanding.
Classification: TN911.21 [Electronics and Telecommunications / Communication and Information Systems]
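
The abstract's central move, replacing the distortion measure d(x_i, y_j) with the generalized information I(x_i; y_j) and minimizing the Shannon mutual information subject to keeping a given amount of that information, can be illustrated numerically. The Python sketch below uses a Blahut-Arimoto-style alternating update with the sign of the exponent flipped, since the constraint is a lower bound on preserved information rather than an upper bound on distortion; the function name, the toy I(x_i; y_j) matrix, and the parameter s are illustrative assumptions, not the computation or data reported in the paper.

import numpy as np

def rate_for_given_info(p_x, info, s, n_iter=500, tol=1e-10):
    """Trace one point of an R(G)-style curve (illustrative sketch).

    p_x  : (n,) source distribution P(x_i)
    info : (n, m) matrix standing in for the generalized information
           I(x_i; y_j); the values used below are made up
    s    : parameter >= 0 trading coding rate against preserved
           generalized information
    Returns (R, G): Shannon mutual information in bits and the average
    generalized information achieved by the optimizing channel.
    """
    n, m = info.shape
    q_y = np.full(m, 1.0 / m)          # output marginal, initialized uniform
    w = np.exp(s * info)               # exponent sign flipped vs. classic R(D):
                                       # information is to be kept, not distortion bounded
    for _ in range(n_iter):
        cond = q_y * w                              # unnormalized P(y_j | x_i)
        cond /= cond.sum(axis=1, keepdims=True)
        q_new = p_x @ cond                          # updated output marginal
        if np.max(np.abs(q_new - q_y)) < tol:
            q_y = q_new
            break
        q_y = q_new
    cond = q_y * w                                  # channel consistent with final marginal
    cond /= cond.sum(axis=1, keepdims=True)

    joint = p_x[:, None] * cond
    R = np.sum(joint * np.log2(cond / q_y))         # Shannon mutual information (bits)
    G = np.sum(joint * info)                        # average generalized information
    return R, G

# Toy binary source with a similarity-style information matrix (made-up numbers).
p_x = np.array([0.5, 0.5])
info = np.array([[1.0, -0.5],
                 [-0.5, 1.0]])
for s in (0.5, 2.0, 8.0):
    R, G = rate_for_given_info(p_x, info, s)
    print(f"s = {s:4.1f}   R = {R:.4f} bit   G = {G:.4f}")

Sweeping s traces out candidate (G, R) pairs: larger s forces the channel to preserve more generalized information at the cost of a higher rate, which is the qualitative behaviour the rate-of-keeping-precision function describes.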