Affiliation: [1] Institute of Pattern Recognition and Image Processing, Shanghai Jiao Tong University, Shanghai 200030, China
Source: Acta Optica Sinica (《光学学报》), 2004, No. 12, pp. 1633-1637 (5 pages)
Funding: National 863 Program (2002AA134020-05); Shanghai Science and Technology Committee project (015115036)
Abstract: An approach to hyper-spectral image compression based on wavelet trellis-coded quantization (TCQ) is proposed. Removing spectral and spatial redundancy constitutes the core of hyper-spectral image compression. The algorithm first applies spectral difference pulse code modulation (DPCM) to remove inter-band redundancy; a discrete wavelet transform is then applied to the DPCM residual images, and each wavelet subband is quantized by trellis-coded quantization with uniform thresholds. Finally, the quantized codewords are entropy-coded with adaptive arithmetic coding. To obtain rate-distortion-optimal quantization thresholds for every subband of all spectral bands, a bit-allocation algorithm based on subband statistics and the rate-distortion characteristics of the trellis-coded quantizer is also designed. Experiments show excellent compression performance: for the test hyper-spectral image, the method achieves a peak signal-to-noise ratio (PSNR) of 37.1 dB at a compression ratio of 32, indicating that the approach compresses hyper-spectral images effectively and is well suited to hyper-spectral image compression applications.
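The stages named in the abstract (spectral DPCM, wavelet decomposition of the residuals, subband quantization, quality measured by PSNR) can be sketched in outline. This is a minimal illustration, not the paper's implementation: trellis-coded quantization is replaced here by a plain uniform scalar quantizer, a single-level 2-D Haar transform stands in for the paper's wavelet, and all function names are invented for this sketch.

```python
import numpy as np

def spectral_dpcm(cube):
    """Spectral difference prediction: band 0 is stored as-is; each
    later band is replaced by its residual against the previous band."""
    resid = cube.astype(np.float64).copy()
    resid[1:] = cube[1:] - cube[:-1]
    return resid

def inverse_dpcm(resid):
    """Undo the spectral prediction by accumulating residuals."""
    return np.cumsum(resid, axis=0)

def haar2d(img):
    """One level of a 2-D Haar transform (stand-in for the paper's
    wavelet); returns the LL, LH, HL, HH subbands."""
    a = (img[0::2] + img[1::2]) / 2.0   # vertical average
    d = (img[0::2] - img[1::2]) / 2.0   # vertical detail
    LL = (a[:, 0::2] + a[:, 1::2]) / 2.0
    HL = (a[:, 0::2] - a[:, 1::2]) / 2.0
    LH = (d[:, 0::2] + d[:, 1::2]) / 2.0
    HH = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return LL, LH, HL, HH

def quantize(band, step):
    """Uniform scalar quantization (illustrative stand-in for TCQ;
    the per-subband step would come from the bit-allocation stage)."""
    return np.round(band / step).astype(np.int64)

def psnr(orig, recon, peak):
    """Peak signal-to-noise ratio in dB, the quality metric reported
    in the abstract."""
    mse = np.mean((orig.astype(np.float64) - recon) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)
```

On a toy cube whose bands differ by a constant, the spectral residual of every later band collapses to that constant, which the Haar LL subband then carries while the detail subbands vanish; this is the redundancy the real coder exploits.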
Keywords: information optics; image compression; hyper-spectral image; wavelet coding; trellis-coded quantization
Classification: TP751 [Automation and Computer Technology — Detection Technology and Automatic Equipment]