
Open Access ePrints

Monocular segment-wise depth: Monocular depth estimation based on a semantic segmentation prior

NU author(s): Dr Amir Atapour Abarghouei

Licence

This is the authors' accepted manuscript of a conference proceedings (inc. abstract) that has been published in its final definitive form by IEEE, 2019.

For re-use rights please refer to the publisher's terms and conditions.


Abstract

Monocular depth estimation using novel learning-based approaches has recently emerged as a promising alternative to more conventional 3D scene capture technologies within real-world scenarios. Many such solutions depend on large quantities of ground truth depth data, which is scarce and often intractable to obtain. Others attempt to estimate disparity as an intermediary step using a secondary supervisory signal, leading to blurring and other undesirable artefacts. In this paper, we propose a monocular depth estimation approach, which employs a jointly-trained pixel-wise semantic understanding step to estimate depth for individually-selected groups of objects (segments) within the scene. The separate depth outputs are efficiently fused to generate the final result. This creates simpler learning objectives for the jointly-trained individual networks, leading to more accurate overall depth. Extensive experimentation demonstrates the efficacy of the proposed approach compared to contemporary state-of-the-art techniques within the literature.
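The fusion step described above can be illustrated with a minimal sketch: per-segment depth predictions are combined by taking each prediction only at the pixels the semantic segmentation assigns to that segment's class. The function name, shapes, and per-class depth maps below are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def fuse_segment_depths(seg_mask, depth_maps):
    """Fuse per-segment depth predictions into one depth map.

    seg_mask:   (H, W) integer array of semantic class ids.
    depth_maps: dict mapping class id -> (H, W) float depth prediction
                (hypothetical output of that segment's depth network).
    """
    fused = np.zeros(seg_mask.shape, dtype=np.float64)
    for cls, depth in depth_maps.items():
        # Take each network's prediction only where the segmentation
        # assigns its class.
        mask = seg_mask == cls
        fused[mask] = depth[mask]
    return fused

# Toy example: a 2x2 image with two classes, each with a constant
# (hypothetical) depth prediction.
seg = np.array([[0, 0],
                [1, 1]])
depths = {0: np.full((2, 2), 5.0),   # e.g. a near object
          1: np.full((2, 2), 20.0)}  # e.g. background
print(fuse_segment_depths(seg, depths))
# → [[ 5.  5.]
#    [20. 20.]]
```

In the paper the fusion is learned jointly with the segmentation and depth networks; this sketch only shows the mask-based selection idea, not the training procedure.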


Publication metadata

Author(s): Atapour-Abarghouei A, Breckon TP

Publication type: Conference Proceedings (inc. Abstract)

Publication status: Published

Conference Name: IEEE International Conference on Image Processing (ICIP 2019)

Year of Conference: 2019

Pages: 4295-4299

Online publication date: 26/08/2019

Acceptance date: 30/04/2019

Date deposited: 06/02/2021

ISSN: 2381-8549

Publisher: IEEE

URL: https://doi.org/10.1109/ICIP.2019.8803551

DOI: 10.1109/ICIP.2019.8803551


ISBN: 9781538662496

