Communications and Networking. 17th EAI International Conference, Chinacom 2022, Virtual Event, November 19-20, 2022, Proceedings

Research Article

DPNet: Depth and Pose Net for Novel View Synthesis via Depth Map Estimation

Cite
@INPROCEEDINGS{10.1007/978-3-031-34790-0_23,
  author={Ge Zhu and Yu Liu and Yumei Wang},
  title={DPNet: Depth and Pose Net for Novel View Synthesis via Depth Map Estimation},
  proceedings={Communications and Networking. 17th EAI International Conference, Chinacom 2022, Virtual Event, November 19-20, 2022, Proceedings},
  proceedings_a={CHINACOM},
  year={2023},
  month={6},
  keywords={view synthesis, IBR, depth map, pixel correspondence, DPNet},
  doi={10.1007/978-3-031-34790-0_23}
}
Ge Zhu1, Yu Liu1,*, Yumei Wang1
  • 1: Institute of Artificial Intelligence
*Contact email: liuy@bupt.edu.cn

Abstract

Novel view synthesis is regarded as one of the efficient ways to realize stereoscopic vision, paving the way to virtual reality. Image-based rendering (IBR) is a view synthesis strategy that warps pixels from source views to target views in order to preserve low-level details. However, IBR methods predict pixel correspondence in an unsupervised way and thus struggle to obtain accurate correspondences. In this paper, we propose the Depth and Pose Net (DPNet) for novel view synthesis via depth map estimation. We introduce two nearby views as implicit supervision to improve pixel correspondence accuracy. The depth net first predicts the source depth map, and the pose net then transforms it into the target depth map, which is used to compute pixel correspondence. Experimental results show that DPNet generates accurate depth maps and thus synthesizes novel views with higher quality than state-of-the-art methods on synthetic object and real scene datasets.
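The depth-to-correspondence step the abstract relies on is the standard rigid warp used in depth-based view synthesis: back-project each source pixel with its depth, move it by the relative pose, and reproject into the target view. A minimal NumPy sketch of that geometry is below; it assumes a pinhole camera model, and the function and variable names are illustrative, not taken from the paper.

```python
import numpy as np

def pixel_correspondence(depth, K, R, t):
    """Map each source pixel to target-view coordinates via its depth.

    Computes the rigid warp p_t ~ K (R * D * K^{-1} * p_s + t), i.e. the
    correspondence a target depth map induces between two calibrated views.
    depth: (H, W) source depth map; K: (3, 3) camera intrinsics;
    R: (3, 3) rotation and t: (3,) translation of the target camera
    relative to the source. Returns uv: (2, H, W) target pixel coords.
    """
    H, W = depth.shape
    # Homogeneous pixel grid, shape (3, H*W): rows are x, y, 1.
    ys, xs = np.mgrid[0:H, 0:W]
    pix = np.stack([xs.ravel(), ys.ravel(), np.ones(H * W)])
    # Back-project pixels to 3-D points in the source camera frame.
    cam = (np.linalg.inv(K) @ pix) * depth.ravel()
    # Transform the points into the target camera frame.
    cam_t = R @ cam + t[:, None]
    # Project into the target image plane and normalise by depth.
    proj = K @ cam_t
    uv = proj[:2] / proj[2]
    return uv.reshape(2, H, W)
```

With an identity pose the warp maps every pixel to itself, and a pure x-translation of 0.1 at depth 2 with focal length 100 shifts every pixel by 100 × 0.1 / 2 = 5 pixels, which is a quick sanity check on the implementation.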

Keywords
view synthesis, IBR, depth map, pixel correspondence, DPNet
Published
2023-06-10
Appears in
SpringerLink
http://dx.doi.org/10.1007/978-3-031-34790-0_23
Copyright © 2022–2025 ICST
