

Research on automatic segmentation algorithm of Korean speech syllables
Citation:LI Mingyu, JIN Xiaofeng. Research on automatic segmentation algorithm of Korean speech syllables[J]. Journal of Yanbian University (Natural Science), 2019, 0(2): 128-135.
Authors:LI Mingyu  JIN Xiaofeng
Affiliation:College of Engineering, Yanbian University, Yanji 133002, China
Abstract:To address the low efficiency of manual annotation of speech corpora, an automatic syllable segmentation method for Korean continuous speech corpora is proposed. First, the Seneff auditory model is used to extract audio feature parameters such as the envelope detector response and the generalized synchrony detector response. Second, candidate syllable boundary positions are determined according to the pronunciation characteristics of Korean. Finally, false boundaries are eliminated by silent-segment and fricative detection to improve boundary-detection accuracy. Experimental results show that the accuracy of the proposed method reaches 93.56%, an improvement of 14.59% over the traditional segmentation algorithm based on the Seneff auditory model, while the recall reaches 86.43%, a decrease of 1.69%. The proposed algorithm is therefore superior overall to the traditional segmentation algorithm based on the Seneff auditory model.
Keywords:Korean speech corpus  automatic corpus annotation  Seneff auditory model  syllable segmentation
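The three-stage pipeline described in the abstract (feature extraction, candidate boundary detection, false-boundary pruning) can be sketched in simplified form. The code below is an illustrative sketch only, not the authors' implementation: it substitutes short-time energy for the Seneff auditory-model responses, places candidate boundaries at local minima of the smoothed envelope, and prunes only silence-based false boundaries (the paper additionally uses fricative detection). All function names, frame sizes, and thresholds are assumptions.

```python
import numpy as np

def frame_energy(signal, frame_len=400, hop=160):
    """Short-time energy per frame (25 ms frames, 10 ms hop at 16 kHz)."""
    n = 1 + max(0, (len(signal) - frame_len) // hop)
    return np.array([np.sum(signal[i * hop:i * hop + frame_len] ** 2)
                     for i in range(n)])

def candidate_boundaries(energy, smooth=5):
    """Candidate syllable boundaries at strict local minima of the
    moving-average-smoothed energy envelope."""
    kernel = np.ones(smooth) / smooth
    env = np.convolve(energy, kernel, mode="same")
    return [i for i in range(1, len(env) - 1)
            if env[i] < env[i - 1] and env[i] < env[i + 1]]

def prune_silence(candidates, energy, thresh_ratio=0.01):
    """Discard candidates lying in near-silent frames (false boundaries)."""
    thresh = thresh_ratio * energy.max()
    return [i for i in candidates if energy[i] >= thresh]
```

In the paper itself the envelope detector response and generalized synchrony detector response of the Seneff model would replace the raw energy envelope here; the pruning stage would also check for fricative segments, which this sketch omits.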
This article is indexed by CNKI and other databases.
