Spinal image segmentation method with multi-attention
Citation: Pu Zhong, Zhang Junhua, Huang Kun, Zhou Qihao. Spinal image segmentation method with multi-attention[J]. Application Research of Computers, 2023, 40(4): 1256-1262.
Authors: Pu Zhong  Zhang Junhua  Huang Kun  Zhou Qihao
Affiliation: School of Information, Yunnan University (all authors)
Fund: National Natural Science Foundation of China (62063034)
Abstract: To address the limited segmentation performance of existing spinal CT and MR image segmentation models, this paper proposes MAU-Net, a spinal segmentation network based on the U-shaped network. First, it introduces a coordinate attention module into the encoder, enabling the network to accurately capture spatial position information and embed it into the channel attention. Next, it proposes a Transformer-based dual-branch channel cross-fusion module that replaces the skip connections to perform multi-scale feature fusion. Finally, it proposes a feature fusion attention module to better bridge the semantic gap between the Transformer features and the convolutional decoder. On a scoliosis CT dataset, MAU-Net reaches a Dice of 0.9296 and an IoU of 0.8597; on the public MR dataset SpineSagT2Wdataset3, its Dice is 14.46% higher than that of FCN. Experimental results show that MAU-Net effectively reduces mis-segmented vertebral regions.
Keywords: spine image segmentation  U-shaped network  coordinate attention  dual-branch channel Transformer  Transformer-Convolution fusion attention
Received: 2022-07-21
Revised: 2023-03-13
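
As background for readers, below is a minimal PyTorch sketch of a coordinate attention block (Hou et al., CVPR 2021), the published module the abstract says MAU-Net embeds in its encoder. The reduction ratio, activation, and channel sizes here are illustrative assumptions; this is not the paper's actual MAU-Net implementation.

import torch
import torch.nn as nn

class CoordinateAttention(nn.Module):
    """Sketch of coordinate attention (Hou et al., CVPR 2021).
    Hyperparameters are illustrative, not taken from the MAU-Net paper."""

    def __init__(self, channels: int, reduction: int = 32):
        super().__init__()
        mid = max(8, channels // reduction)
        # Pool along one spatial axis at a time, so each branch keeps
        # precise positional information along the other axis.
        self.pool_h = nn.AdaptiveAvgPool2d((None, 1))  # -> (B, C, H, 1)
        self.pool_w = nn.AdaptiveAvgPool2d((1, None))  # -> (B, C, 1, W)
        self.conv1 = nn.Conv2d(channels, mid, kernel_size=1)
        self.bn = nn.BatchNorm2d(mid)
        self.act = nn.Hardswish()
        self.conv_h = nn.Conv2d(mid, channels, kernel_size=1)
        self.conv_w = nn.Conv2d(mid, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        x_h = self.pool_h(x)                       # (B, C, H, 1)
        x_w = self.pool_w(x).permute(0, 1, 3, 2)   # (B, C, W, 1)
        # Encode both directions jointly, then split back per axis.
        y = self.act(self.bn(self.conv1(torch.cat([x_h, x_w], dim=2))))
        y_h, y_w = torch.split(y, [h, w], dim=2)
        a_h = torch.sigmoid(self.conv_h(y_h))                      # (B, C, H, 1)
        a_w = torch.sigmoid(self.conv_w(y_w.permute(0, 1, 3, 2)))  # (B, C, 1, W)
        # Direction-aware attention weights rescale the input feature map.
        return x * a_h * a_w

# Example: a 64-channel encoder feature map keeps its shape.
feat = torch.randn(1, 64, 128, 128)
print(CoordinateAttention(64)(feat).shape)  # torch.Size([1, 64, 128, 128])

Because the two pooled branches retain row and column positions respectively, the resulting attention weights are position-sensitive, which is what lets the encoder embed spatial location into channel attention as the abstract describes.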
