Affiliation: [1] School of Communication and Information Engineering, Shanghai University, Shanghai 200072, China
Source: Journal of Image and Graphics (《中国图象图形学报》), 2006, No. 11, pp. 1614-1618 (5 pages)
Funding: National Natural Science Foundation of China (60572127); Shanghai Municipal Education Commission Development Fund (05AZ43); Shanghai Science and Technology Committee Key Research Project (055115008); Shanghai Universities Special Research Fund for Outstanding Young Teachers (2006)
Abstract: Moving object segmentation in the compressed domain is absolutely necessary for real-time applications. Owing to its superior compression efficiency, the emerging video coding standard H.264 is replacing MPEG-2/4 in many multimedia applications, yet moving object segmentation in the H.264 compressed domain has rarely been investigated. This paper presents a new approach for segmenting moving objects in real time from the H.264 compressed domain. The raw motion vector (MV) field extracted from the H.264 video is first normalized in both the temporal and spatial domains, and the MV fields of several consecutive frames are accumulated to enhance salient motion. Global motion compensation is then performed on the accumulated MV field, while a fast statistical region-growing algorithm partitions it into motion-homogeneous regions according to motion similarity. Finally, the orientation histogram of the MV field is used to identify which segmented regions belong to moving objects. Experimental results on several MPEG-4 test sequences demonstrate that the proposed approach can segment moving objects from the H.264 compressed domain in real time with good segmentation quality.
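The abstract outlines a pipeline of MV-field normalization, multi-frame accumulation, and orientation-histogram-based region classification. The following is a minimal Python/NumPy sketch of those pre-processing steps only, not the authors' implementation: all function names, array shapes, and parameters are illustrative assumptions, and the extraction of MVs from the H.264 bitstream, the global motion compensation, and the statistical region growing are not shown.

# Minimal sketch (assumed interfaces, not the paper's code) of MV-field
# temporal/spatial normalization, multi-frame accumulation, and the
# orientation histogram used to characterize a region's motion direction.
import numpy as np

def normalize_mv_field(mv, ref_dist, grid_rows, grid_cols):
    """Temporally scale each motion vector by its reference-frame distance
    and resample the block-level field onto a uniform grid.

    mv       : (H, W, 2) array, one (dx, dy) per smallest block
    ref_dist : (H, W) array, frame distance to each block's reference picture
    """
    # Temporal normalization: displacement per frame.
    mv_per_frame = mv / np.maximum(ref_dist[..., None], 1)
    # Spatial normalization: nearest-neighbour resampling to a uniform grid.
    h, w = mv_per_frame.shape[:2]
    rows = np.linspace(0, h - 1, grid_rows).astype(int)
    cols = np.linspace(0, w - 1, grid_cols).astype(int)
    return mv_per_frame[np.ix_(rows, cols)]

def accumulate_mv_fields(fields):
    """Sum normalized MV fields of consecutive frames so that consistent
    (salient) motion is reinforced relative to noise."""
    return np.sum(np.stack(fields, axis=0), axis=0)

def orientation_histogram(mv_field, bins=8):
    """Normalized histogram of MV direction angles for a field or region."""
    angles = np.arctan2(mv_field[..., 1], mv_field[..., 0])  # [-pi, pi]
    hist, _ = np.histogram(angles, bins=bins, range=(-np.pi, np.pi))
    return hist / max(hist.sum(), 1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic MV fields for 5 consecutive frames of an 18x22-block picture.
    frames = [rng.normal(0.0, 1.0, size=(18, 22, 2)) for _ in range(5)]
    dists = np.ones((18, 22))
    normalized = [normalize_mv_field(f, dists, 18, 22) for f in frames]
    acc = accumulate_mv_fields(normalized)
    print(orientation_histogram(acc))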
CLC Number: TN941.1 [Electronics and Telecommunications — Signal and Information Processing]; TN911.73 [Electronics and Telecommunications — Information and Communication Engineering]