Authors: 张雷洪[1], 杜晓萌[1], 樊丽萍[1], 梁东[1], 赖河木
Affiliation: [1] University of Shanghai for Science and Technology, Shanghai 200093, China
Source: Packaging Engineering, 2015, No. 9, pp. 114-118 (5 pages)
Funding: Humanities and Social Sciences Foundation of University of Shanghai for Science and Technology (1F-13-309-001)
Abstract: Objective: To propose an image enhancement method that combines the objective Itti model with a subjective eye-movement model. Methods: Saliency maps obtained from subjective and objective human visual characteristics were used in the construction of the histogram. A weighting parameter was used to combine the saliency map computed by the objective Itti visual attention model with the saliency map generated from data collected in subjective eye-movement experiments. The corresponding regions of the image were then adjusted according to the new weighting coefficient to enhance the image, and the result was evaluated with an objective image enhancement quality assessment algorithm based on visual perception. Results: The objective, perception-based evaluation showed that different enhancement results could be achieved by changing the value of the parameter λ, which adjusts the relative weights of the objective Itti model and the subjective eye-movement model. When λ = 0.7, the proposed method gave a better enhancement effect than either the objective Itti model or the subjective eye-movement model alone. Conclusion: Image enhancement that combines subjective and objective human visual characteristics outperforms traditional image enhancement algorithms and yields satisfactory visual quality.
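The abstract describes the core steps (blend two saliency maps with weight λ, then use the blended map to drive a histogram-based enhancement) but gives no formulas. A minimal sketch of one plausible reading, assuming a convex combination `λ·S_itti + (1−λ)·S_eye` and a saliency-weighted histogram equalization (the paper's exact histogram construction may differ), could look like:

```python
import numpy as np

def blend_saliency(s_itti, s_eye, lam=0.7):
    """Convex combination of the objective (Itti) and subjective
    (eye-movement) saliency maps, normalized to [0, 1].
    lam weights the Itti map; 1 - lam weights the eye-movement map."""
    s = lam * s_itti + (1.0 - lam) * s_eye
    rng = s.max() - s.min()
    return (s - s.min()) / rng if rng > 0 else np.zeros_like(s)

def saliency_weighted_equalize(img, saliency, levels=256):
    """Histogram equalization in which each pixel's vote into the
    histogram is weighted by its saliency, so salient regions
    dominate the intensity mapping."""
    hist = np.zeros(levels)
    np.add.at(hist, img.ravel(), saliency.ravel())  # weighted histogram
    cdf = np.cumsum(hist)
    cdf = cdf / cdf[-1]  # normalize cumulative distribution to [0, 1]
    lut = np.round(cdf * (levels - 1)).astype(np.uint8)
    return lut[img]  # apply lookup table to every pixel
```

Both function names and the weighted-equalization step are illustrative assumptions, not the paper's published implementation; they only make the λ-blending described in the abstract concrete.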