Mobile and Ubiquitous Systems: Computing, Networking and Services. 20th EAI International Conference, MobiQuitous 2023, Melbourne, VIC, Australia, November 14–17, 2023, Proceedings, Part II

Research Article

Quality Evaluation of Image Segmentation in Mobile Augmented Reality

Cite

BibTeX:
@INPROCEEDINGS{10.1007/978-3-031-63992-0_27,
    author={Shneka Muthu Kumara Swamy and Qi Han},
    title={Quality Evaluation of Image Segmentation in Mobile Augmented Reality},
    proceedings={Mobile and Ubiquitous Systems: Computing, Networking and Services. 20th EAI International Conference, MobiQuitous 2023, Melbourne, VIC, Australia, November 14--17, 2023, Proceedings, Part II},
    proceedings_a={MOBIQUITOUS PART 2},
    year={2024},
    month={7},
    keywords={Augmented Reality, Quality of Segmentation, Neural Network for Mobile Devices},
    doi={10.1007/978-3-031-63992-0_27}
}
    
Plain text: Shneka Muthu Kumara Swamy, Qi Han. Quality Evaluation of Image Segmentation in Mobile Augmented Reality. MOBIQUITOUS PART 2. Springer, 2024. DOI: 10.1007/978-3-031-63992-0_27
Shneka Muthu Kumara Swamy*, Qi Han
    *Contact email: smuthukumaraswamy@mines.edu

    Abstract

    Mobile Augmented Reality (AR) facilitates a seamless interactive experience between actual and virtual environments. AR employs segmented images for various purposes such as object recognition, occlusion boundary estimation, and foreground-background separation. However, evaluating the quality of segmented images in mobile AR is challenging due to the limited resources of mobile devices. Existing solutions employ neural networks with many layers, making them difficult to deploy on mobile devices. To address this issue, we propose techniques that modify the network's inputs so that the number of layers can be reduced, making deployment on mobile devices feasible. This idea is incorporated into our proposed SegQNet, which utilizes deep learning techniques based on convolutional neural networks (CNNs) to evaluate the quality of overlaid segmentation in mobile AR. SegQNet achieves high accuracy without the need for ground-truth images or expensive computations. Our experiments on Android smartphones demonstrate that SegQNet outperforms two state-of-the-art methods without incurring significant overhead.
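    The abstract does not specify SegQNet's architecture or its exact input modifications, so the following NumPy sketch is only a hypothetical illustration of the general idea: enrich the input (here, by stacking the predicted mask as an extra channel alongside the camera frame) so that a very shallow network can estimate overlay quality without ground truth. All function names, the single-layer design, and the 3x3 kernel are assumptions for illustration, not the paper's method.

    ```python
    import numpy as np

    def make_segqnet_input(frame, mask):
        """Hypothetical input modification: concatenate the camera frame
        (H, W, 3) with the predicted segmentation mask (H, W) as a fourth
        channel, giving the network direct access to the overlay."""
        return np.concatenate([frame, mask[..., None]], axis=-1)  # (H, W, 4)

    def conv2d_valid(x, w):
        """Naive 'valid' convolution of one k x k x C kernel over all
        input channels (no padding, stride 1)."""
        H, W, _ = x.shape
        k = w.shape[0]
        out = np.zeros((H - k + 1, W - k + 1))
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                out[i, j] = np.sum(x[i:i + k, j:j + k, :] * w)
        return out

    def quality_score(frame, mask, w):
        """Toy one-layer 'network': conv + ReLU + global average pooling,
        squashed through a sigmoid to a (0, 1) quality estimate."""
        x = make_segqnet_input(frame, mask)
        h = np.maximum(conv2d_valid(x, w), 0.0)   # ReLU activation
        return 1.0 / (1.0 + np.exp(-h.mean()))    # sigmoid of pooled response
    ```

    In a real system the kernel weights would be learned from labeled quality ratings; the point of the sketch is that once the mask is folded into the input, even a single convolutional layer produces a per-frame score, which is what makes on-device deployment plausible.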

    Keywords
    Augmented Reality, Quality of Segmentation, Neural Network for Mobile Devices
    Published
    2024-07-19
    Appears in
    SpringerLink
    http://dx.doi.org/10.1007/978-3-031-63992-0_27
    Copyright © 2023–2025 ICST
    Indexed in: EBSCO, ProQuest, DBLP, DOAJ, Portico