Authors: 高银花[1], 陈进[2], 季霞[1]; GAO Yinhua, CHEN Jin, JI Xia (Nanjing Audit University, Nanjing 211815, China; Nanjing Normal University, Nanjing 210023, China)
Affiliations: [1] Nanjing Audit University, Nanjing 211815, China; [2] Nanjing Normal University, Nanjing 210023, China
Source: Laser Journal (《激光杂志》), 2022, No. 11, pp. 204-209 (6 pages)
Funding: National Natural Science Foundation of China (No. U1831127); Nanjing Audit University research project (No. 2021JG066).
Abstract: When existing methods are used to build 3D models of illumination scenes, the feature points of the scene are not extracted and matched, which leads to poor modeling quality and low modeling accuracy. To address this, a 3D modeling method for illumination scenes based on virtual reality technology is proposed. First, feature points are detected in the plan view of the illumination scene and defined. Second, their relationships in 3D space are obtained through feature-point matching, and the illumination scene is scanned with virtual reality technology to obtain point-cloud data. Then, based on the spatial relationships of the feature points, the centre point of the point-cloud data is determined, an ellipsoid is constructed, and the point cloud is decomposed to generate a surface that approximates the illumination scene. Finally, weighted fitting is applied to this surface to produce the 3D surface of the illumination scene, completing the 3D model. Experimental results show that the proposed method achieves good modeling quality and high modeling accuracy.
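The final step described in the abstract, weighted fitting of the approximating surface, can be illustrated with a minimal sketch. This is not the paper's implementation: the quadratic surface model, the Gaussian weighting centred on the cloud's centre point, and all names and synthetic data below are assumptions made purely for illustration.

```python
# Minimal, illustrative sketch (assumed, not the authors' method): weighted
# least-squares fitting of a quadratic surface z = f(x, y) to point-cloud
# samples, with weights that decay with distance from a scene centre point.
import numpy as np

def fit_weighted_quadric(points, centre, sigma=1.0):
    """Fit z = a0 + a1*x + a2*y + a3*x^2 + a4*x*y + a5*y^2 by weighted LSQ."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    # Design matrix of quadratic monomials in (x, y).
    A = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
    # Gaussian weights: points near the centre influence the fit more
    # (hypothetical weighting scheme, chosen only for this example).
    d2 = np.sum((points - centre) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * sigma**2))
    # Weighted least squares: scale each row of A and z by sqrt(weight).
    sw = np.sqrt(w)
    coeffs, *_ = np.linalg.lstsq(A * sw[:, None], z * sw, rcond=None)
    return coeffs

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic "point cloud": noisy samples of z = 0.5*x^2 + 0.2*y^2.
    xy = rng.uniform(-1, 1, size=(500, 2))
    z = 0.5 * xy[:, 0] ** 2 + 0.2 * xy[:, 1] ** 2 + rng.normal(0, 0.01, 500)
    pts = np.column_stack([xy, z])
    centre = pts.mean(axis=0)  # stand-in for the point cloud's centre point
    print(fit_weighted_quadric(pts, centre, sigma=0.8))
```

The printed coefficients recover the underlying surface despite the noise, which is the role the weighted fitting step plays in the abstract's pipeline: smoothing the decomposed point cloud into a surface that approximates the illumination scene.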
Keywords: virtual reality technology; feature point extraction; feature point matching; 3D modeling; surface fitting
Classification: TN929.11 (Electronics and Telecommunications: Communication and Information Systems)