Source: Journal of Chinese Computer Systems (《小型微型计算机系统》), 2017, No. 10, pp. 2177-2181 (5 pages)
Funding: National Natural Science Foundation of China (Grants 61201179 and 61571326)
Abstract: This paper adopts a two-channel method for image input. First, a local matching method generates a low-resolution depth image as the first channel's input, while the natural low-resolution color image is fed to the second channel. We then propose a novel joint sparse representation model to simultaneously recover the color image and the depth image from their blurred and down-sampled versions. The method exploits the coupled correspondence between a color image and the depth map of the same scene: a joint dictionary for color and depth is learned by jointly clustering image patches, and the problem is regularized with a multi-parameter regularization term over the color and depth patches. An alternating direction minimization algorithm then solves the proposed model, restoring the high-resolution color image and its corresponding depth image simultaneously. In addition, the original image is segmented into regions by the Mean Shift algorithm, and the resulting disparity map is refined region by region. To evaluate the effectiveness of the proposed algorithm, we conduct experiments on the Middlebury dataset and compare against different methods in both objective indices and subjective visual quality. The experimental results show that the proposed algorithm simultaneously yields a satisfactory color image and a high-quality depth map.
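To make the shared-code coupling between color and depth concrete, the sketch below codes one color/depth patch pair against a stacked joint dictionary. It is a minimal, illustrative Python example, not the paper's method: it assumes the joint dictionaries are already learned and the patch pairs already extracted, and the names (soft_threshold, joint_sparse_code) and the plain ISTA loop with a single l1 penalty are hypothetical stand-ins for the paper's multi-parameter regularized model and its alternating direction minimization solver.

    import numpy as np

    def soft_threshold(x, t):
        """Element-wise soft-thresholding, the proximal operator of the l1 norm."""
        return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

    def joint_sparse_code(y_color, y_depth, D_color, D_depth,
                          lam=0.1, n_iter=200):
        """Recover one sparse code shared by a color patch and its depth patch.

        Solves  min_a ||y_c - D_c a||^2 + ||y_d - D_d a||^2 + lam * ||a||_1
        by iterative shrinkage-thresholding (ISTA); the single code 'a'
        is what couples the two modalities.
        """
        # Stack the two observations and the two dictionaries into one system.
        y = np.concatenate([y_color, y_depth])
        D = np.vstack([D_color, D_depth])
        # Step size: 1 / Lipschitz constant of the data-term gradient.
        step = 1.0 / (2.0 * np.linalg.norm(D, 2) ** 2)
        a = np.zeros(D.shape[1])
        for _ in range(n_iter):
            grad = 2.0 * D.T @ (D @ a - y)      # gradient of the data term
            a = soft_threshold(a - step * grad, step * lam)
        return a

    # Toy usage: random dictionaries, 8x8 patches flattened to 64-dim vectors.
    rng = np.random.default_rng(0)
    D_c = rng.standard_normal((64, 256))
    D_d = rng.standard_normal((64, 256))
    a = joint_sparse_code(rng.standard_normal(64), rng.standard_normal(64),
                          D_c, D_d)
    print("non-zeros in shared code:", np.count_nonzero(a))

In the full model, the data term would additionally contain the blur and down-sampling operators, and separate regularization weights for the color and depth terms would replace the single lam used here.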
Keywords: super-resolution reconstruction; sparse representation; stereo matching; joint dictionary learning; regularized optimization
Classification: TP391 [Automation and Computer Technology: Computer Application Technology]