Funding: National Natural Science Foundation of China (61972102, 61603100)
Received: 2021-01-25

A Two-way Network Model for Semantic Segmentation
Yang Yun-long,Liang Lu,Teng Shao-hua.A Two-way Network Model for Semantic Segmentation[J].Journal of Guangdong University of Technology,2022,39(1):63-70.
Authors:Yang Yun-long  Liang Lu  Teng Shao-hua
Affiliation:School of Computer Science and Technology, Guangdong University of Technology, Guangzhou 510006, China
Abstract:When deep convolutional neural networks perform semantic segmentation of high-resolution remote sensing images, downsampling blurs object edges, so the segmentation results are poorly delineated near edges and misclassifications increase. Adding edge information to the network can improve the model's segmentation of remote sensing images. Therefore, a two-way network model for semantic segmentation is proposed, which adds an edge network to learn the edge features of the target and uses these edge features to refine the segmentation features. Moreover, as a multi-task learning model, the segmentation network and the edge network can be trained simultaneously. The effectiveness of the two-way network model is demonstrated on the ISPRS Potsdam and ISPRS Vaihingen datasets, where it achieves leading results compared with multiple semantic segmentation models.
Keywords:two-way network  edge detection  high-resolution remote sensing images  semantic segmentation  
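The abstract describes a two-branch design: an edge network learns edge features, those features refine the segmentation features, and both branches are trained jointly under a multi-task objective. The NumPy sketch below illustrates this idea only; the element-wise gating used for fusion and the loss weight `lam` are illustrative assumptions, not the paper's exact modules or hyperparameters.

```python
import numpy as np

def edge_refine(seg_feat, edge_map):
    """Refine segmentation features with an edge map.

    seg_feat: (C, H, W) features from the segmentation branch.
    edge_map: (H, W) edge probabilities in [0, 1] from the edge branch.
    The fusion operator here (element-wise gating that amplifies features
    near predicted edges) is an assumption for illustration.
    """
    return seg_feat * (1.0 + edge_map)

def multitask_loss(seg_loss, edge_loss, lam=0.5):
    """Joint objective so both branches can be trained simultaneously.

    lam is a hypothetical weight balancing the two task losses.
    """
    return seg_loss + lam * edge_loss

# Toy usage with random features standing in for network activations.
rng = np.random.default_rng(0)
feat = rng.standard_normal((4, 8, 8))   # 4-channel segmentation features
edge = rng.random((8, 8))               # per-pixel edge probabilities
refined = edge_refine(feat, edge)       # same shape as feat, edge-emphasized
loss = multitask_loss(seg_loss=1.2, edge_loss=0.4)
```

With a zero edge map the gating leaves the segmentation features unchanged, so the refinement only acts where the edge branch is confident.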
This article is indexed in Wanfang Data and other databases.