IoT 22(29): e2

Research Article

Superresolution Reconstruction of Magnetic Resonance Images Based on a Nonlocal Graph Network

@ARTICLE{10.4108/eetiot.v8i29.769,
    author={Yuanhang Li and Xiang Li and Yidan Yan and Chaochao Hu},
    title={Superresolution Reconstruction of Magnetic Resonance Images Based on a Nonlocal Graph Network},
    journal={EAI Endorsed Transactions on Internet of Things},
    volume={8},
    number={29},
    publisher={EAI},
    journal_a={IOT},
    year={2022},
    month={5},
    keywords={Magnetic resonance imaging, Superresolution reconstruction, Nonlocal operation, Nonlocal self-similarity, Graph attention},
    doi={10.4108/eetiot.v8i29.769}
}
    
Yuanhang Li1, Xiang Li1, Yidan Yan1, Chaochao Hu1,*
  • 1: Henan Polytechnic University
*Contact email: Chaos-hu@home.hpu.edu.cn

Abstract

INTRODUCTION: High-resolution (HR) medical images are crucial for doctors when diagnosing the internal pathological structures of patients and formulating precise treatment plans.

OBJECTIVES: Existing superresolution (SR) methods cannot adequately capture the nonlocal self-similarity information of images. To address this problem, we propose using graph convolution to capture nonlocal self-similarity information.

METHODS: This paper proposes a nonlocal graph network (NLGN) to perform single magnetic resonance (MR) image SR. Specifically, the proposed network comprises a nonlocal graph module (NLGM) and a nonlocal graph attention block (NLGAB). The NLGM is designed with densely connected residual blocks, which can fully explore the features of input images and prevent the loss of information. The NLGAB efficiently captures the dependency relationships among the given data by merging a nonlocal operation (NL) and a graph attention layer (GAL). In addition, to enable the current node to aggregate more beneficial information, each node aggregates information only from the neighbor nodes that are closest to it.

RESULTS: For the scale r=2, the proposed NLGN achieves a PSNR of 38.54 dB and an SSIM of 0.9818 on the T(T1, BD) dataset, improvements of 0.27 dB and 0.0008, respectively, over the CSN method.

CONCLUSION: The experimental results obtained on the IXI dataset show that the proposed NLGN performs better than the state-of-the-art methods.
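The core idea in the METHODS section, nonlocal attention restricted to each node's nearest neighbors, can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the function name, the dot-product similarity, and the softmax weighting are assumptions standing in for the NLGAB's learned graph attention layer.

```python
import math

def nonlocal_knn_attention(features, k=2):
    """Sketch: each node compares itself against all other nodes (the
    nonlocal operation), keeps only its k most similar neighbors, and
    aggregates their features with softmax attention weights.

    features: list of feature vectors (lists of floats), one per node.
    Returns a list of aggregated feature vectors, one per node.
    """
    n = len(features)
    out = []
    for i in range(n):
        # Nonlocal step: similarity of node i to every other node.
        sims = [(sum(a * b for a, b in zip(features[i], features[j])), j)
                for j in range(n) if j != i]
        # Keep only the k neighbors closest to the current node.
        top = sorted(sims, reverse=True)[:k]
        # Softmax attention weights over the selected neighbors.
        m = max(s for s, _ in top)
        exps = [math.exp(s - m) for s, _ in top]
        z = sum(exps)
        weights = [e / z for e in exps]
        # Weighted aggregation of the selected neighbors' features.
        agg = [sum(w * features[j][d] for w, (_, j) in zip(weights, top))
               for d in range(len(features[i]))]
        out.append(agg)
    return out
```

Restricting aggregation to the top-k neighbors, rather than attending over all pixels as a plain nonlocal block does, is what lets each node draw on the most relevant self-similar patches while suppressing noise from dissimilar regions.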