
N-best Rescoring Algorithm Based on Recurrent Neural Network Language Model
Cite this article: Zhang Jian, Qu Dan, Li Zhen. N-best rescoring algorithm based on recurrent neural network language model [J]. Journal of Data Acquisition & Processing, 2016, 31(2): 347-354.
Authors: Zhang Jian, Qu Dan, Li Zhen
Affiliation: Institute of Information Systems Engineering, PLA Information Engineering University, Zhengzhou 450002, China
Abstract: The recurrent neural network language model (RNNLM) overcomes the data-sparseness problem of statistical language models and imposes stronger long-distance constraints, making it an important language modeling method. During speech decoding, however, the model causes the word lattice to be expanded so many times that the search space becomes too large for practical use. This paper proposes an N-best rescoring algorithm based on the RNNLM: RNNLM probability scores are introduced through the N-best list to rerank the recognition results, and a cache model is introduced to optimize the decoding process and obtain the best recognition result. Experimental results show that the proposed method effectively reduces the word error rate of the speech recognition system.

Keywords: speech recognition; language model; recurrent neural network; N-best rescoring; cache language model

N-best Rescoring Algorithm Based on Recurrent Neural Network Language Model
Zhang Jian,Qu Dan,Li Zhen.N-best Rescoring Algorithm Based on Recurrent Neural Network Language Model[J].Journal of Data Acquisition & Processing,2016,31(2):347-354.
Authors:Zhang Jian  Qu Dan  Li Zhen
Affiliation:Institute of Information Systems Engineering, PLA Information Engineering University, Zhengzhou, 450002, China
Abstract: The recurrent neural network language model (RNNLM) is an important approach to statistical language modeling because it can tackle the data-sparseness problem and capture longer-distance constraints. However, it lacks practicability in decoding because the word lattice has to be expanded too many times, exploding the search space. Therefore, an N-best rescoring algorithm is proposed which uses the RNNLM to rerank the recognition results and a cache model to optimize the decoding process. Experimental results show that the proposed method can effectively reduce the word error rate of the speech recognition system.
Keywords: speech recognition; language model; recurrent neural network; N-best rescoring; cache language model
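The rescoring scheme the abstract describes can be illustrated with a minimal sketch: each N-best hypothesis keeps its decoder (acoustic) score, a combined language-model score is computed by interpolating an RNNLM score with a cache-model score, and the list is reranked by the weighted sum. All function names, score values, weights, and the toy unigram stand-ins below are hypothetical, not the paper's actual models.

```python
import math

# Hypothetical N-best list: (word sequence, acoustic log-score from the decoder).
nbest = [
    ("the cat sat", -12.0),
    ("the cat sad", -11.5),
    ("a cat sat",   -13.0),
]

def rnnlm_logprob(words):
    # Stand-in for a real RNNLM: returns a toy per-word log-probability sum.
    table = {"the": -1.0, "a": -2.0, "cat": -1.5, "sat": -1.2, "sad": -3.0}
    return sum(table.get(w, -5.0) for w in words)

def cache_logprob(words, cache):
    # Simple unigram cache model with add-one smoothing: recently seen
    # words receive a higher probability.
    return sum(math.log((cache.count(w) + 1) / (len(cache) + 50)) for w in words)

def rescore(nbest, cache, lm_weight=10.0, interp=0.5):
    # Rerank hypotheses by acoustic score plus the weighted, interpolated
    # language-model score; return the top-scoring word sequence.
    scored = []
    for text, acoustic in nbest:
        words = text.split()
        lm = interp * rnnlm_logprob(words) + (1 - interp) * cache_logprob(words, cache)
        scored.append((acoustic + lm_weight * lm, text))
    return max(scored)[1]

best = rescore(nbest, cache=["the", "cat", "sat"])
print(best)
```

Here the cache contents pull the reranking toward the hypothesis whose words were recently observed, which is the intuition behind adding a cache model on top of the RNNLM score.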