Self-attention Hypergraph Pooling Network
Authors: Yingfu Zhao, Fusheng Jin, Ronghua Li, Hongchao Qin, Peng Cui, Guoren Wang
Affiliation: School of Computer Science and Technology, Beijing Institute of Technology, Beijing 100081, China
Abstract: Recently, Graph Convolutional Networks (GCNs) have attracted much attention by generalizing convolutional neural networks to graph data, which requires redefining the convolution and pooling operations on graphs. However, because an ordinary graph can only model dyadic (pairwise) relations, GCNs often perform poorly on real-world data with higher-order interactions. In contrast, a hypergraph can capture high-order interactions and represent complex data flexibly through its hyperedges. Existing hypergraph convolutional networks are still immature, however, and no effective hypergraph pooling operation is currently available. This paper therefore proposes a hypergraph pooling network with a self-attention mechanism. Modeling the data as a hypergraph, the model learns hidden node features that carry high-order information through a hypergraph convolution operation incorporating self-attention, selects nodes that are important in both structure and content through a hypergraph pooling operation, and thereby obtains a more accurate hypergraph representation. Experiments on text classification, dish classification, and protein classification tasks show that the proposed method outperforms recent state-of-the-art methods.
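The abstract describes two building blocks: a hypergraph convolution that propagates node features through hyperedges, and a self-attention pooling step that keeps only the highest-scoring nodes. A minimal NumPy sketch of this pipeline is given below; the incidence matrix `H`, the weight matrix `Theta`, and the attention vector `a` are hypothetical stand-ins (the paper's actual layer definitions and learned parameters are not given in this abstract), and the convolution follows the common HGNN-style normalization with unit hyperedge weights.

```python
import numpy as np

def hypergraph_conv(X, H, Theta):
    """One HGNN-style hypergraph convolution:
    X' = Dv^{-1/2} H De^{-1} H^T Dv^{-1/2} X Theta
    (hyperedge weights taken as identity for simplicity)."""
    Dv = np.diag(1.0 / np.sqrt(H.sum(axis=1)))  # inverse sqrt node degrees
    De = np.diag(1.0 / H.sum(axis=0))           # inverse hyperedge degrees
    return Dv @ H @ De @ H.T @ Dv @ X @ Theta

def self_attention_pool(X, H, a, k):
    """Score each node with a (hypothetical) learnable vector `a`,
    keep the top-k nodes, and gate their features by the scores."""
    scores = np.tanh(X @ a)                     # one attention score per node
    idx = np.argsort(-scores)[:k]               # indices of the k best nodes
    return X[idx] * scores[idx, None], H[idx]   # pooled features, incidence

# Toy hypergraph: 6 nodes, 3 hyperedges (rows = nodes, cols = hyperedges).
H = np.array([[1, 1, 0],
              [1, 0, 0],
              [0, 1, 1],
              [1, 0, 1],
              [0, 1, 0],
              [0, 0, 1]], dtype=float)
rng = np.random.default_rng(0)
X = rng.normal(size=(6, 4))        # 6 nodes with 4-dim features
Theta = rng.normal(size=(4, 4))    # layer weights (random stand-in)
a = rng.normal(size=4)             # attention vector (random stand-in)

X1 = hypergraph_conv(X, H, Theta)
Xp, Hp = self_attention_pool(X1, H, a, k=3)
print(Xp.shape, Hp.shape)          # → (3, 4) (3, 3)
```

Stacking several conv-then-pool stages and reading out the surviving node features would give the hypergraph-level representation used for the classification tasks mentioned above.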
Keywords: hypergraph; Convolutional Neural Network (CNN); pooling; Graph Convolutional Network (GCN); hypergraph neural network