Affiliation: [1] Faculty of Information Science and Engineering, Ningbo University, Ningbo 315211, Zhejiang, China
Source: Optical Technique (《光学技术》), 2016, No. 4, pp. 351-356 (6 pages)
Funding: National Natural Science Foundation of China (U1301257; 61271270; 61311140262); National Key Technology R&D Program of China (2012BAH67F01)
Abstract: Considering the important role that human visual attention plays in video quality assessment (VQA), a VQA method that incorporates human visual attention characteristics is proposed. First, a global saliency map is obtained with a three-dimensional Sobel operator and a just-noticeable-difference (JND) model, and a structure tensor is built for each salient pixel to compute the global quality of a frame. Second, a local saliency map derived from the video's motion information and the center-bias property of human gaze is used for perceptual weighting, yielding the local quality of the frame. The quality of a frame is then obtained by balancing its local and global qualities. Finally, a temporal pooling model trained by machine learning weights the frame scores to produce the objective video quality score. Tested on the LIVE video database, the proposed method achieves a PLCC (Pearson Linear Correlation Coefficient) of 0.827 and an SROCC (Spearman Rank Order Correlation Coefficient) of 0.802; compared with existing related algorithms, its evaluation results agree more closely with human subjective perception.
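To make the pipeline in the abstract concrete, the following is a minimal sketch of a spatio-temporal saliency map built with a 3D Sobel operator plus a global/local quality combination. The function names, the luminance-based JND stand-in, the squared-error distortion measure, the Gaussian center-bias weighting, and the equal global/local balance are all illustrative assumptions, not the paper's exact formulation (which uses structure tensors, motion vectors, and a learned temporal pooling model).

```python
# Hedged sketch of the frame-level stage described in the abstract.
# Assumptions (not from the paper): squared error as the distortion measure,
# a simple luminance-adaptive threshold in place of the JND model, a Gaussian
# center-bias map in place of the motion/gaze-based local saliency, and a
# 50/50 mix of global and local quality.
import numpy as np
from scipy import ndimage


def spatiotemporal_saliency(frames: np.ndarray) -> np.ndarray:
    """frames: (T, H, W) grayscale clip; returns per-pixel gradient magnitude (T, H, W)."""
    gt = ndimage.sobel(frames, axis=0)  # temporal gradient
    gy = ndimage.sobel(frames, axis=1)  # vertical spatial gradient
    gx = ndimage.sobel(frames, axis=2)  # horizontal spatial gradient
    return np.sqrt(gt ** 2 + gy ** 2 + gx ** 2)


def jnd_threshold(frames: np.ndarray, base: float = 3.0) -> np.ndarray:
    """Crude luminance-adaptive visibility threshold (placeholder for the JND model)."""
    local_mean = ndimage.uniform_filter(frames, size=(1, 5, 5))
    return base * (1.0 + local_mean / 255.0)


def frame_quality(ref: np.ndarray, dist: np.ndarray, t: int) -> float:
    """Combine a global (saliency-masked) score with a local (center-weighted) score for frame t."""
    sal = spatiotemporal_saliency(ref)[t]
    visible = sal > jnd_threshold(ref)[t]          # keep only perceptually salient pixels
    err = (ref[t].astype(float) - dist[t].astype(float)) ** 2

    global_q = -np.mean(err[visible]) if visible.any() else 0.0

    h, w = ref.shape[1:]
    yy, xx = np.mgrid[0:h, 0:w]
    center = np.exp(-(((yy - h / 2) ** 2 + (xx - w / 2) ** 2)
                      / (2 * (0.25 * min(h, w)) ** 2)))  # center-bias weighting
    local_q = -np.sum(center * err) / np.sum(center)

    return 0.5 * global_q + 0.5 * local_q  # equal balance is an assumption


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ref = rng.integers(0, 256, size=(8, 64, 64)).astype(float)
    dist = np.clip(ref + rng.normal(0, 5, ref.shape), 0, 255)
    print(frame_quality(ref, dist, t=4))
```

In the paper, the per-frame scores produced at this stage would then be pooled over time by a learned weighting model rather than simple averaging.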
Keywords: video quality assessment; human visual perception characteristics; just-noticeable-difference model; saliency map; structure tensor; machine learning
CLC number: TN391.4 [Electronics and Telecommunications — Physical Electronics]