Graph Convolution Overattention (ASGCN-AOA) Model for Specific Aspects of Sentiment Analysis
Citation: XIA Hongbin, GU Yan, LIU Yuan. Graph Convolution Overattention (ASGCN-AOA) Model for Specific Aspects of Sentiment Analysis[J]. Journal of Chinese Information Processing, 2022, 36(3): 146-153.
Authors: XIA Hongbin, GU Yan, LIU Yuan
Affiliation: 1. School of Artificial Intelligence and Computer Science, Jiangnan University, Wuxi, Jiangsu 214122, China;
2. Jiangsu Key Laboratory of Media Design and Software Technology, Wuxi, Jiangsu 214122, China
Funding: National Natural Science Foundation of China (61672264)
Abstract: In aspect-level sentiment analysis, attention-based and convolutional neural network models cannot capture the dependencies between long-distance words in a sentence and the related syntactic constraints, and may therefore use context words that are syntactically irrelevant to an aspect as clues for judging its sentiment. To address this problem, this paper proposes an aspect-level sentiment classification model (ASGCN-AOA) that combines a graph convolutional network (GCN) with an attention-over-attention (AOA) neural network. First, a bidirectional long short-term memory network is used to model aspect-specific representations among the context words. Then, a graph convolutional network is built over the dependency tree of each sentence to obtain aspect features that account for both syntactic dependencies and long-distance multi-word relations. Finally, the AOA attention mechanism captures the interaction between aspect words and the context sentence and automatically attends to the important parts of the sentence. Experiments on five datasets (Twitter, Lap14, Rest14, Rest15, and Rest16), evaluated with accuracy and macro-F1, show that the proposed model improves significantly over related aspect-based sentiment analysis algorithms.
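The pipeline the abstract describes can be illustrated with a minimal NumPy sketch of its two core computations: a graph-convolution layer over a dependency-tree adjacency matrix, and the AOA (attention-over-attention) interaction between aspect and context states. All dimensions, variable names, and the random stand-ins for the BiLSTM outputs are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gcn_layer(H, A, W):
    """One GCN layer over a dependency-tree adjacency matrix.
    H: (n, d) node features; A: (n, n) adjacency; W: (d, d_out) weights."""
    A_hat = A + np.eye(A.shape[0])                 # add self-loops
    D_inv = 1.0 / A_hat.sum(axis=1, keepdims=True) # degree normalization
    return np.maximum(D_inv * (A_hat @ H) @ W, 0.0)  # ReLU(D^-1 (A+I) H W)

def aoa(S, T):
    """Attention-over-attention: context states S (n, d) vs aspect states T (m, d)."""
    I = S @ T.T                  # (n, m) interaction matrix
    alpha = softmax(I, axis=0)   # column-wise: context attention per aspect word
    beta = softmax(I, axis=1)    # row-wise: aspect attention per context word
    beta_bar = beta.mean(axis=0) # (m,) averaged aspect-level attention
    gamma = alpha @ beta_bar     # (n,) final context attention (sums to 1)
    return S.T @ gamma           # (d,) attended sentence representation

n, m, d = 7, 2, 16
S = rng.standard_normal((n, d))                # stand-in for BiLSTM context states
A = (rng.random((n, n)) < 0.3).astype(float)   # toy dependency adjacency
A = np.maximum(A, A.T)                         # symmetrize the tree edges
S_gcn = gcn_layer(S, A, rng.standard_normal((d, d)))
r = aoa(S_gcn, S_gcn[2:2 + m])                 # aspect span = tokens 2..3
print(r.shape)                                 # (16,)
```

In the paper's full model, `r` would be fed to a softmax classifier over sentiment polarities; here the sketch only shows how the GCN injects syntactic structure before the AOA step selects aspect-relevant context words.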
Keywords: natural language understanding; graph convolution network; long short-term memory network; attention-over-attention neural network; artificial intelligence
