Deep global-attention based convolutional network with dense connections for text classification |
| |
Authors: | Tang Xianlun Chen Yingjie Xu Jin Yu Xinxian |
| |
Affiliation: | 1. School of Automation, Chongqing University of Posts and Telecommunications, Chongqing 400065, China 2. School of Computer Science and Technology, Chongqing University of Posts and Telecommunications, Chongqing 400065, China |
| |
Abstract: | Text classification is a classic task in natural language processing (NLP). Convolutional neural networks (CNNs) have demonstrated their effectiveness in sentence and document modeling. However, most existing CNN models use fixed-size convolution filters and are therefore unable to adapt to varying local dependencies. To address this problem, a deep global-attention based convolutional network with dense connections (DGA-CCN) is proposed. In this framework, dense connections link each convolution layer to every other layer, so that each layer receives information from all preceding layers and captures local information at multiple scales. The local information extracted by the convolution layers is then reweighted by deep global attention to obtain a sequence representation that emphasizes the most valuable information in the whole sequence. A series of experiments is conducted on five text classification benchmarks, and the results show that the proposed model improves upon state-of-the-art baselines on four of the five datasets, demonstrating its effectiveness for text classification. |
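The two mechanisms the abstract describes — dense connections feeding each convolution layer the outputs of all previous layers, and a global attention step that reweights positions over the whole sequence — can be sketched as follows. This is a minimal illustrative NumPy sketch, not the authors' implementation; all dimensions, the ReLU nonlinearity, and the single global query vector are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
seq_len, emb_dim, n_layers = 10, 16, 3  # hypothetical sizes, not from the paper

def conv1d(x, w):
    """Same-padded 1D convolution over the sequence axis with ReLU.
    x: (seq_len, in_dim), w: (k, in_dim, out_dim)."""
    k = w.shape[0]
    pad = k // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))
    out = np.stack([xp[i:i + k].reshape(-1) @ w.reshape(k * w.shape[1], -1)
                    for i in range(x.shape[0])])
    return np.maximum(out, 0.0)

x = rng.normal(size=(seq_len, emb_dim))  # stand-in for token embeddings

# Dense connections: each layer consumes the concatenation of the input
# and every previous layer's output, so later layers see multi-scale
# local features extracted earlier.
features = [x]
for _ in range(n_layers):
    inp = np.concatenate(features, axis=1)
    w = rng.normal(size=(3, inp.shape[1], emb_dim)) * 0.1
    features.append(conv1d(inp, w))

h = np.concatenate(features, axis=1)  # (seq_len, total_dim)

# Global attention: score every position against a global query vector,
# softmax the scores, then pool the reweighted positions into a single
# sequence representation.
q = rng.normal(size=h.shape[1])
scores = h @ q
alpha = np.exp(scores - scores.max())
alpha /= alpha.sum()        # attention weights over positions, sum to 1
sent_repr = alpha @ h       # (total_dim,) sequence-level representation
```

The dense concatenation is what lets the model combine receptive fields of several effective sizes without committing to one fixed filter width.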
| |
Keywords: | |
|
Source: | The Journal of China Universities of Posts and Telecommunications (English edition) |
|