Chinese Summarization Research on Combination of Local Attention and Convolutional Neural Network
Citation: ZHOU Caidong, ZENG Biqing, WANG Shengyu, SHANG Qi. Chinese Summarization Research on Combination of Local Attention and Convolutional Neural Network[J]. Computer Engineering and Applications, 2019, 55(8): 132-137.
Authors: ZHOU Caidong  ZENG Biqing  WANG Shengyu  SHANG Qi
Affiliation: 1. School of Computer, South China Normal University, Guangzhou 510631, China; 2. School of Software, South China Normal University, Foshan, Guangdong 528225, China
Funding: National Key Research and Development Program of China; Natural Science Foundation of Jiangsu Province; Jiangsu Province "Qinglan Project"
Abstract: Deep learning has been widely applied to English text summarization, but it is rarely used for Chinese text summarization. Moreover, the dominant approach in text summarization is the encoder-decoder model, which feeds raw text into the encoder and makes no use of high-level text features; the encoded representation is therefore insufficient, and the generated summaries suffer from repeated words, disordered word order, and similar problems. This paper therefore proposes an encoder-decoder model with high-level feature extraction capability that combines local attention with a convolutional neural network. The model extracts high-level text features by combining a local attention mechanism with a convolutional neural network and uses them as the encoder input; a decoder based on a global attention mechanism then generates the summary. Experimental results on a Chinese text dataset show that the model produces better summaries than competing models.

Keywords: text summarization  neural network  attention mechanism

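The feature-extraction idea described in the abstract, local attention followed by convolution, can be illustrated with a minimal toy sketch. This is not the paper's exact formulation: the dot-product scoring function, the window size, and the kernel shapes below are illustrative assumptions, and real models would use learned projections and trained filters.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def local_attention(embeddings, window=2):
    """For each position, attend only to a local window around it
    (dot-product scoring, an assumption for illustration) and return
    the attention-weighted average of the neighbouring embeddings."""
    n = len(embeddings)
    dim = len(embeddings[0])
    out = []
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        scores = [dot(embeddings[i], embeddings[j]) for j in range(lo, hi)]
        weights = softmax(scores)
        ctx = [sum(w * embeddings[j][d]
                   for w, j in zip(weights, range(lo, hi)))
               for d in range(dim)]
        out.append(ctx)
    return out

def conv1d(seq, kernel):
    """Valid 1-D convolution over the attended sequence: each kernel row
    covers one position of the receptive field, producing one scalar
    feature per step (one hypothetical filter)."""
    k = len(kernel)
    return [sum(dot(kernel[j], seq[i + j]) for j in range(k))
            for i in range(len(seq) - k + 1)]

# Toy usage: 3 word embeddings of dimension 2 -> attended sequence -> features.
embs = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
attended = local_attention(embs, window=1)
features = conv1d(attended, kernel=[[1.0, 0.0], [0.0, 1.0]])
```

In the paper's model, features produced this way would serve as the encoder input in place of the raw embeddings, after which a global-attention decoder generates the summary.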
This article is indexed in the VIP and Wanfang Data databases.