Authors: Zane K. J. Hartley, Aaron S. Jackson, Michael Pound, Andrew P. French
Affiliations: [1] School of Computer Science, University of Nottingham, NG7 1BB, UK; [2] School of Biosciences, University of Nottingham, LE12 5RD, UK
Source: Plant Phenomics, 2021, Issue 1, pp. 326-336 (11 pages)
Funding: Engineering and Physical Sciences Research Council [EP/R513283/1], awarded to Zane K. J. Hartley.
Abstract: 3D reconstruction of fruit is a key component of fruit grading and an important part of many size-estimation pipelines. Like many computer vision challenges, the 3D reconstruction task suffers from a lack of readily available training data in most domains, with methods typically depending on large datasets of high-quality image-model pairs. In this paper, we propose an unsupervised domain-adaptation approach to 3D reconstruction in which labelled images exist only in our source synthetic domain, and training is supplemented with different unlabelled datasets from the target real domain. We approach the problem of 3D reconstruction using volumetric regression and produce a training set of 25,000 image-volume pairs using hand-crafted 3D models of bananas rendered in a 3D modelling environment (Blender). Each image is then enhanced by a GAN to more closely match the domain of real photographs, with a volumetric consistency loss introduced to improve the performance of 3D reconstruction on real images. Our solution harnesses the cost benefits of synthetic data while still maintaining good performance on real-world images. We focus this work on the task of 3D banana reconstruction from a single image, a common task in plant phenotyping, but the approach is general and may be adapted to other 3D reconstruction tasks, including other plant species and organs.
Classification code: TP391.41 [Automation and Computer Technology - Computer Application Technology]
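To make the training scheme described in the abstract concrete, the following is a minimal sketch, assuming a PyTorch implementation: a generator translates synthetic Blender renders toward the real-photo domain, a discriminator supplies the adversarial signal, and a volumetric regression network applied to the translated image is penalised against the known synthetic occupancy volume (a volumetric consistency term). All module architectures, the lambda_vol weight, and the tensor shapes below are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class Generator(nn.Module):
    """Image-to-image translator: synthetic render -> real-looking photo (toy architecture)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1), nn.Tanh(),
        )
    def forward(self, x):
        return self.net(x)

class Discriminator(nn.Module):
    """Scores whether an image looks like a real photograph."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 1, 4, stride=2, padding=1),
        )
    def forward(self, x):
        return self.net(x).mean(dim=(1, 2, 3))

class VolumetricRegressor(nn.Module):
    """Predicts a coarse occupancy volume (D x H x W) from a single image."""
    def __init__(self, depth=32, size=32):
        super().__init__()
        self.depth, self.size = depth, size
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=4), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, depth * size * size),
        )
    def forward(self, x):
        return self.encoder(x).view(-1, self.depth, self.size, self.size)

gen, disc, vol_net = Generator(), Discriminator(), VolumetricRegressor()
opt_g = torch.optim.Adam(list(gen.parameters()) + list(vol_net.parameters()), lr=2e-4)
opt_d = torch.optim.Adam(disc.parameters(), lr=2e-4)

def training_step(synthetic_img, synthetic_vol, real_img, lambda_vol=10.0):
    """One illustrative update: adversarial loss plus volumetric consistency."""
    # Discriminator: real photographs vs. translated synthetic renders.
    fake_img = gen(synthetic_img)
    d_loss = (F.binary_cross_entropy_with_logits(disc(real_img), torch.ones(real_img.size(0)))
              + F.binary_cross_entropy_with_logits(disc(fake_img.detach()), torch.zeros(fake_img.size(0))))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator: fool the discriminator while keeping the 3D content intact,
    # i.e. the translated image must still regress to the known synthetic volume.
    adv_loss = F.binary_cross_entropy_with_logits(disc(fake_img), torch.ones(fake_img.size(0)))
    pred_vol = vol_net(fake_img)
    vol_loss = F.binary_cross_entropy_with_logits(pred_vol, synthetic_vol)
    g_loss = adv_loss + lambda_vol * vol_loss
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return d_loss.item(), g_loss.item()

if __name__ == "__main__":
    # Random tensors stand in for a batch of synthetic renders, their ground-truth
    # occupancy volumes, and unlabelled real photographs.
    syn_img = torch.rand(2, 3, 128, 128) * 2 - 1
    syn_vol = (torch.rand(2, 32, 32, 32) > 0.5).float()
    real_img = torch.rand(2, 3, 128, 128) * 2 - 1
    print(training_step(syn_img, syn_vol, real_img))

The key design point the sketch tries to capture is that the consistency term ties the GAN translation to the reconstruction objective, so the generator cannot improve photo-realism by distorting geometry that the volumetric regressor depends on.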