Authors: Zhang Ning [1], Li Zhongjian [1], Pan Ruru [1], Gao Weidong [1], Han Yaobin
Affiliation: [1] Key Laboratory of Eco-Textiles, Ministry of Education (Jiangnan University), Wuxi, Jiangsu 214122, China
Source: Journal of Textile Research, 2017, No. 5, pp. 37-42 (6 pages)
Funding: Doctoral Program Foundation of the Ministry of Education (20120093130001); China Postdoctoral Science Foundation (2013M541602); Jiangsu Postdoctoral Science Foundation (1301075C); 2014 Jiangsu Graduate Research and Innovation Program (KYLX_1132); Priority Academic Program Development of Jiangsu Higher Education Institutions (苏政办发[2014]37号)
Abstract: To address the long design cycle of yarn-dyed fabric products and the time-consuming, labor-intensive trial weaving and proofing, a method is proposed for simulating realistic yarn-dyed fabrics from colored spun yarn images. First, color yarn images are captured and processed by threshold segmentation and morphological processing to extract the yarn body; the upper and lower boundaries and the center line of the yarn body are then located, yielding the main part of the original yarn image. Next, the yarn-body image is processed with an ellipse model and a sine-curve model to obtain the shape of the yarn on the two-dimensional fabric surface. Finally, the cover relation between warp and weft yarns is changed according to the color-yarn repeat and the weave transformation model, realizing the simulation of realistic striped and checked yarn-dyed fabrics. Simulation results show that the proposed algorithm can simulate the weaving process of different kinds of yarn-dyed fabrics and truly reflect the appearance of the fabric; the weave and color-yarn repeat parameters can also be adjusted, improving the realism and adaptability of existing simulation algorithms.
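The yarn-body extraction step described in the abstract (threshold segmentation, morphological cleaning, then per-column upper and lower boundaries and a center line) can be sketched roughly as follows. This is an illustrative reconstruction rather than the authors' code; the fixed threshold value, the 3x3 structuring element, and the synthetic test image are all assumptions.

```python
import numpy as np
from scipy import ndimage

def extract_yarn_body(gray, thresh=128):
    """Segment a grayscale yarn image by thresholding, clean the mask
    with a morphological opening, and return the per-column upper
    boundary, lower boundary, and center line of the yarn body.
    Columns with no yarn pixels are marked with -1."""
    mask = gray > thresh                                   # threshold segmentation
    mask = ndimage.binary_opening(mask, structure=np.ones((3, 3)))  # remove speckle
    upper, lower, center = [], [], []
    for col in mask.T:                                     # scan column by column
        rows = np.flatnonzero(col)
        if rows.size:
            upper.append(rows[0])
            lower.append(rows[-1])
            center.append((rows[0] + rows[-1]) / 2.0)
        else:
            upper.append(-1)
            lower.append(-1)
            center.append(-1.0)
    return np.array(upper), np.array(lower), np.array(center)

# Synthetic "yarn": a bright horizontal band on a dark background.
img = np.zeros((40, 60), dtype=np.uint8)
img[15:25, :] = 200
up, lo, ce = extract_yarn_body(img)
```

On this synthetic image the upper boundary of every column is row 15, the lower boundary is row 24, and the center line lies at 19.5. In the paper's pipeline these three curves feed the ellipse and sine-curve models that bend the straightened yarn body into its shape on the fabric surface.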
Classification code: TS101.9 [Light Industry Technology and Engineering / Textile Engineering]