2nd International ICST Conference on Immersive Telecommunications

Research Article

Multi-baseline Disparity Fusion for Immersive Videoconferencing

@INPROCEEDINGS{10.4108/immerscom.2009.6585,
    author={Oliver Schreer and Nicole Atzpadin and Ingo Feldmann and Peter Kauff},
    title={Multi-baseline Disparity Fusion for Immersive Videoconferencing},
    proceedings={2nd International ICST Conference on Immersive Telecommunications},
    keywords={Immersive telepresence, 3D videoconferencing, multi-view video, disparity analysis},
    year={2010},
    doi={10.4108/immerscom.2009.6585}
}
Oliver Schreer1,*, Nicole Atzpadin1,*, Ingo Feldmann1,*, Peter Kauff1,*
  • 1: Fraunhofer Institute for Telecommunications / Heinrich-Hertz-Institut, Einsteinufer 37, 10587 Berlin, Germany
*Contact email: Oliver.Schreer@fraunhofer.hhi.de, Nicole.Atzpadin@fraunhofer.hhi.de, Ingo.Feldmann@fraunhofer.hhi.de, Peter.Kauff@fraunhofer.hhi.de


The European FP7 project 3DPresence is developing a multi-party, high-end 3D videoconferencing concept that tackles the problem of conveying the feeling of physical presence in real time to multiple remote locations in a transparent and natural way. Traditional set-top camera videoconferencing systems still fail to meet the 'telepresence challenge' of providing a viable alternative to physical business travel, which is nowadays characterized by unacceptable delays, costs, inconvenience, and an increasingly large ecological footprint. Even recent high-end commercial solutions, while removing some of these traditional shortcomings, still suffer from limited scalability, expensive implementations, the lack of life-sized 3D representations of the remote participants, and only very limited support for eye contact and gesture-based interaction. One of the many challenges in this project is to calculate depth information for many different views in order to synthesize novel views that provide eye contact. In this paper, we present a multi-baseline disparity fusion scheme for improved real-time disparity map estimation. The advantages and disadvantages of different configurations are discussed, and theoretical considerations regarding disparity resolution and baseline are presented. Together with experimental investigations, these observations lead to a multi-baseline configuration that exploits the advantages of small- and wide-baseline stereo cameras as well as trifocal camera configurations.
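The trade-off between disparity resolution and baseline mentioned above follows from the standard rectified-stereo geometry Z = f·B/d: a one-pixel disparity change corresponds to a larger depth jump for a small baseline than for a wide one. The following sketch is not from the paper; it simply illustrates this well-known relation with hypothetical focal length and baseline values.

```python
# Illustrative sketch of the rectified-stereo relation Z = f * B / d,
# showing why baseline choice trades off depth resolution against range.
# All numeric values below are hypothetical, not taken from the paper.

def depth_from_disparity(f_px, baseline_m, disparity_px):
    """Depth in metres for a rectified stereo pair; f_px is the focal
    length in pixels, baseline_m the camera separation in metres."""
    return f_px * baseline_m / disparity_px

def depth_quantization_step(f_px, baseline_m, disparity_px):
    """Depth change caused by a one-pixel disparity increment at this
    depth: a wider baseline yields a smaller (finer) step."""
    z = depth_from_disparity(f_px, baseline_m, disparity_px)
    z_next = depth_from_disparity(f_px, baseline_m, disparity_px + 1)
    return z - z_next

f_px = 1000.0  # hypothetical focal length in pixels
for b in (0.05, 0.20):           # small vs. wide baseline, in metres
    d = f_px * b / 2.0           # disparity of a point 2 m from the rig
    step = depth_quantization_step(f_px, b, d)
    print(f"baseline {b} m: disparity {d:.1f} px, depth step {step*100:.2f} cm")
```

For a point at 2 m, the 5 cm baseline quantizes depth in roughly 7.7 cm steps while the 20 cm baseline achieves roughly 2 cm steps, which is why the fusion scheme can combine the easier matching of small baselines with the finer depth resolution of wide ones.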