Authors: FANG Ying; XU Yiwen; ZHAO Tiesong
Affiliation: Fujian Key Lab for Intelligent Processing and Wireless Transmission of Media Information, Fuzhou University, Fuzhou 350001, China
Source: Journal on Communications (通信学报), 2023, No. 5, pp. 42-51
Funding: National Natural Science Foundation of China (No. 62171134); Natural Science Foundation of Fujian Province (No. 2022J02015)
Abstract: To accurately transmit the content meaning of vibrotactile signals and achieve both intelligent recognition and signal reconstruction, a joint vibrotactile coding scheme for machine recognition and human perception was proposed. At the encoding end, the original three-dimensional vibrotactile signals were converted into one-dimensional signals; the semantic information of the signals was then extracted using a short-time Fourier transform and efficiently compressed and represented. At the decoding end, a fully convolutional neural network performed intelligent tactile recognition based on the semantic information. Meanwhile, the residual between the original signals and the signals reconstructed from the semantic information was used as compensation for the semantic information, gradually improving the quality of the reconstructed signals to meet human perceptual needs. Experimental results show that the proposed scheme achieves tactile recognition with semantic information at a lower bit rate while improving the compression efficiency of tactile data under the constraint of human perceptual requirements.
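The encoding pipeline summarized in the abstract (collapsing a 3-axis vibrotactile signal to one channel, extracting a time-frequency "semantic" representation via STFT, and forming a residual for successive refinement) can be sketched as follows. This is a minimal illustration, not the paper's actual method: the norm-based 3D-to-1D mapping, the Hann-windowed STFT parameters, and the top-k coefficient pruning used to stand in for semantic compression are all assumptions made for the sketch.

```python
import numpy as np

def to_one_dimensional(signal_3d):
    # Collapse a 3-axis vibrotactile signal (N x 3) to one channel.
    # The per-sample Euclidean norm is an illustrative choice; the
    # paper's actual dimensionality-reduction mapping may differ.
    return np.linalg.norm(signal_3d, axis=1)

def stft(x, win=64, hop=32):
    # Short-time Fourier transform with a Hann window (assumed parameters).
    window = np.hanning(win)
    frames = [x[i:i + win] * window for i in range(0, len(x) - win + 1, hop)]
    return np.fft.rfft(np.stack(frames), axis=1)

np.random.seed(0)

# Toy 3-axis vibration signal: two sinusoids plus noise on the third axis.
t = np.arange(512) / 1000.0
sig3d = np.stack([np.sin(2 * np.pi * 50 * t),
                  0.5 * np.sin(2 * np.pi * 120 * t),
                  0.25 * np.random.randn(512)], axis=1)

sig1d = to_one_dimensional(sig3d)
S = stft(sig1d)  # time-frequency "semantic" representation

# Stand-in for semantic compression: keep only the k strongest
# coefficients per frame, zeroing the rest.
k = 8
weakest = np.argsort(np.abs(S), axis=1)[:, :-k]
S_compressed = S.copy()
np.put_along_axis(S_compressed, weakest, 0, axis=1)

# Residual between the full representation and its compressed version:
# in the scheme this is sent as compensation to progressively refine
# reconstruction quality for human perception.
residual = S - S_compressed
print(S.shape)  # frames x frequency bins
```

A decoder needing only machine recognition would stop at `S_compressed`; adding `residual` back recovers the full representation, mirroring the scalable machine-then-human refinement described in the abstract.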
Keywords: haptics; semantic information; perceptual quality; joint coding; intelligent recognition; vibrotactile
Classification: TP37 (Automation and Computer Technology: Computer System Architecture)