Authors: 陶霖密[1], 于亚鹏[1], 邸慧军[1], 孙洛[1]
Affiliation: [1] Department of Computer Science, Tsinghua University, Beijing 100084, China
Source: Journal of Image and Graphics (《中国图象图形学报》), 2012, No. 9, pp. 1150-1157 (8 pages)
Funding: National Natural Science Foundation of China (60873266; 90820304)
Abstract: Locating people has long been one of the most active topics in computer vision and is a foundation of human-computer interaction. Locating and tracking people is the inverse problem of multi-camera imaging: given video from one or more cameras, compute the position or motion of the people in the scene. This paper proposes two classes of geometric constraints on the projection relations among multiple cameras, together with a people-locating method based on these constraints. Evaluating the accuracy of a locating algorithm requires the person's exact position in the world coordinate system, which has long been a difficulty in computer vision experiments. We therefore present a new experimental method based on virtual data, from which the person's position in the world coordinate system can be obtained conveniently and accurately. Because the geometric constraints are complete, the locating procedure makes no assumptions about camera placement or pose, and imposes no restrictions on the person's position in the scene (occlusion) or behavior. Combining virtual and real data allows the accuracy and adaptability of the method to be verified more reliably. Experimental results show that the algorithm runs stably and locates people with high accuracy. On the basis of the locating results, the person's trajectory in the world coordinate system is also recovered.
CLC number: TP301.6 (Automation and Computer Technology — Computer System Architecture)
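The abstract frames people locating as the inverse problem of multi-camera imaging: from image observations in several calibrated views, recover the position in the world coordinate system. The paper's specific geometric constraints are not given in this record, but the generic form of this inverse problem can be sketched with linear (DLT) triangulation. Everything below — the function name, the synthetic camera matrices, and the test point — is a hypothetical illustration, not the authors' algorithm:

```python
import numpy as np

def triangulate(projections, image_points):
    """Linear (DLT) triangulation: recover a 3D world point from its
    projections in two or more calibrated cameras.

    projections  : list of 3x4 camera projection matrices P_i
    image_points : list of (u, v) image coordinates, one per camera
    """
    rows = []
    for P, (u, v) in zip(projections, image_points):
        # Each view gives two linear constraints on the homogeneous point X:
        #   u * (P[2] @ X) = P[0] @ X,   v * (P[2] @ X) = P[1] @ X
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.asarray(rows)
    # The null-space solution (last right singular vector) minimizes ||A X||
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenize

# Synthetic check: two cameras and one known world point (all made up)
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])              # camera at the origin
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0], [0]])])  # camera shifted 1 unit in x
Xw = np.array([0.5, 0.2, 4.0, 1.0])                        # homogeneous world point
pts = []
for P in (P1, P2):
    x = P @ Xw
    pts.append((x[0] / x[2], x[1] / x[2]))                 # project to image coords
Xr = triangulate([P1, P2], pts)
print(np.allclose(Xr, Xw[:3], atol=1e-6))  # True
```

With noise-free synthetic observations the linear solution is exact up to numerical precision; this is also why the abstract's virtual-data experiments are attractive, since the ground-truth world position is known by construction rather than measured.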