Wireless Mobile Communication and Healthcare. 10th EAI International Conference, MobiHealth 2021, Virtual Event, November 13–14, 2021, Proceedings

Research Article

CANet: Compact Attention Network for Automatic Melanoma Segmentation

Cite

BibTeX
    @INPROCEEDINGS{10.1007/978-3-031-06368-8_22,
        author={Yingyan Hou and Kaichuang Liu},
        title={CANet: Compact Attention Network for Automatic Melanoma Segmentation},
        proceedings={Wireless Mobile Communication and Healthcare. 10th EAI International Conference, MobiHealth 2021, Virtual Event, November 13--14, 2021, Proceedings},
        proceedings_a={MOBIHEALTH},
        year={2022},
        month={6},
        keywords={Melanoma segmentation, Atrous convolution, Attention mechanism},
        doi={10.1007/978-3-031-06368-8_22}
    }
    
Plain Text

    Yingyan Hou, Kaichuang Liu
    Year: 2022
    CANet: Compact Attention Network for Automatic Melanoma Segmentation
    MOBIHEALTH
    Springer
    DOI: 10.1007/978-3-031-06368-8_22
Yingyan Hou1,*, Kaichuang Liu2
  • 1: School of Software, Tsinghua University
  • 2: Department of Automation, Tsinghua University
*Contact email: hyy20@mails.tsinghua.edu.cn

Abstract

In the study of skin cancer, particularly melanoma, automatic and accurate segmentation is a crucial step in Computer-Aided Diagnosis (CAD), providing a reliable basis for efficient clinical diagnosis and pathology research. However, because skin lesions vary widely in texture and shape and often have complex boundaries, automatic and accurate segmentation remains an unsolved challenge. In this paper, we propose a new automatic segmentation network for melanoma, the Compact Attention Network (CANet). Built on fully convolutional networks, CANet removes down-sampling so that spatial accuracy is not reduced, and it expands the receptive field with a designed atrous convolution that avoids the gridding issue. To refine the feature maps and produce smoother segmentation edges, we add an attention module after every designed atrous convolution. Our model achieves State-of-the-Art (SOTA) performance on melanoma segmentation compared with U-Net, SegNet, FrCN, and other networks, and we conduct ablation experiments to verify the effectiveness of each element of the network. On the International Skin Imaging Collaboration (ISIC) test dataset, CANet reaches 91.7% Sensitivity and a 90.7% Dice score, outperforming FrCN, U-Net, SegNet, Mask R-CNN, and nnU-Net in Dice by 3.6%, 14.5%, 8.6%, 5.4%, and 1.7%, respectively. CANet can therefore perform medical image segmentation more accurately and quickly, provide an important reference for medical workers in diagnosing diseases, and improve diagnostic efficiency.
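The abstract describes CANet's core pattern: atrous (dilated) convolutions that enlarge the receptive field without down-sampling, each followed by an attention module that refines the feature map. The paper's code is not reproduced on this page, so the sketch below is only an illustration of that general pattern under assumed settings (PyTorch, a 3x3 dilated convolution with "same" padding, and a simple squeeze-and-excitation-style channel attention); the dilation rates, channel widths, and attention variant are assumptions, not the published CANet configuration.

# Illustrative sketch only; not the authors' published CANet implementation.
import torch
import torch.nn as nn


class AtrousAttentionBlock(nn.Module):
    """Dilated convolution followed by a channel-attention gate.

    Mirrors the pattern in the abstract: enlarge the receptive field without
    down-sampling, then reweight the resulting feature map with attention.
    Layer sizes and the attention variant here are assumed for illustration.
    """

    def __init__(self, in_ch: int, out_ch: int, dilation: int = 2, reduction: int = 8):
        super().__init__()
        # padding == dilation with a 3x3 kernel keeps the spatial size,
        # so no resolution (spatial accuracy) is lost.
        self.atrous = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, kernel_size=3,
                      padding=dilation, dilation=dilation, bias=False),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )
        # Simple squeeze-and-excitation-style channel attention applied
        # after the atrous convolution.
        self.attention = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(out_ch, out_ch // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_ch // reduction, out_ch, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feat = self.atrous(x)
        return feat * self.attention(feat)  # reweight channels, keep resolution


if __name__ == "__main__":
    block = AtrousAttentionBlock(in_ch=3, out_ch=32, dilation=2)
    out = block(torch.randn(1, 3, 128, 128))
    print(out.shape)  # torch.Size([1, 32, 128, 128]) -- spatial size preserved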

Keywords
Melanoma segmentation, Atrous convolution, Attention mechanism
Published
2022-06-07
Appears in
SpringerLink
http://dx.doi.org/10.1007/978-3-031-06368-8_22