Affiliations: [1] School of Computer Science and Technology, Harbin Institute of Technology, Harbin 150001, China; [2] College of Information, Qingdao University of Science and Technology, Qingdao 266042, China
Source: Chinese High Technology Letters (《高技术通讯》), 2010, No. 5, pp. 473-480 (8 pages)
Funding: Supported by the National 863 Program (2006AA010103) and the National Natural Science Foundation of China (60672163)
Abstract: Aimed at the problem that existing confusion network generation methods cannot keep a trade-off between generation speed and network quality, this paper investigates two lattice segmentation methods, one based on cross-section consistency and one based on maximum confidence, with the purpose of using them to reduce the impact of segmentation on confusion network quality, and on this basis presents a method for fast generation of high-quality confusion networks based on lattice segmentation. The method segments the large-scale lattice produced by automatic speech recognition (ASR) into sequences of smaller sub-lattices and then generates confusion networks from these sub-lattices, thus remarkably decreasing the computation scale and increasing the generation speed. The balance between generation speed and network quality is controlled by the number of segments. Experimental results show that, compared with the traditional word-clustering method without lattice segmentation, the proposed method significantly improves the speed of confusion network generation while holding almost the same network quality. At the same speed, the proposed method obtains a lower tonal syllable error rate than the word-clustering method with lattice pruning.
Keywords: confusion network (CN); lattice; multiple candidates; speech recognition
CLC number: TP393.02 [Automation and Computer Technology — Computer Application Technology]
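The core idea in the abstract — cutting a large lattice into smaller sub-lattices and building a confusion network per segment — can be illustrated with a minimal sketch. This is not the paper's algorithm (the paper's cross-section-consistency and maximum-confidence criteria are not reproduced here); it is a hypothetical toy that splits a topologically numbered lattice at "pinch" nodes every path must pass through, which is the simplest segmentation that does not break any hypothesis path. All node and word names are illustrative only.

```python
# Toy lattice: arcs (src, dst, word) over topologically numbered nodes 0..4.
# Hypothetical example data for illustration.
ARCS = [(0, 1, "a"), (0, 1, "b"), (1, 2, "c"),
        (1, 3, "d"), (2, 3, "e"), (3, 4, "f")]

def cut_points(arcs, start, end):
    """Nodes that every start->end path passes through.

    With topological numbering, an interior node n is such a cut point
    iff no arc spans over it (i.e., no arc (u, v) with u < n < v).
    """
    spanned = set()
    for u, v, _word in arcs:
        spanned.update(range(u + 1, v))  # nodes jumped over by this arc
    return ([start]
            + [n for n in range(start + 1, end) if n not in spanned]
            + [end])

def segment(arcs, cuts):
    """Split the arc set into sub-lattices between consecutive cut points."""
    return [[a for a in arcs if lo <= a[0] and a[1] <= hi]
            for lo, hi in zip(cuts, cuts[1:])]

cuts = cut_points(ARCS, 0, 4)
print(cuts)            # [0, 1, 3, 4]
for sub in segment(ARCS, cuts):
    print(sub)
```

Each sub-lattice would then be handed to a standard confusion-network builder (e.g., word-clustering, as in the baseline the paper compares against), and the per-segment networks concatenated; because the builder's cost grows faster than linearly in lattice size, working on small segments is what yields the speed-up described in the abstract.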