D2NT: A High-Performing Depth-to-Normal Translator

Tongji University, HKUST
ICRA 2023

Abstract

Surface normals hold significant importance in visual environmental perception, serving as a rich source of geometric information. However, state-of-the-art surface normal estimators (SNEs) generally suffer from an unsatisfactory trade-off between efficiency and accuracy. To resolve this dilemma, this paper first presents a superfast depth-to-normal translator (D2NT), which can directly translate depth images into surface normal maps without calculating 3D coordinates. We then propose a discontinuity-aware gradient (DAG) filter, which adaptively generates gradient convolution kernels to improve depth gradient estimation. Finally, we propose a surface normal refinement module that can easily be integrated into any depth-to-normal SNE, substantially improving surface normal estimation accuracy. Our proposed algorithm achieves the highest accuracy among existing real-time SNEs and the state-of-the-art (SoTA) trade-off between efficiency and accuracy.
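To make the "without calculating 3D coordinates" claim concrete: for a pinhole camera with focal lengths f_x, f_y and principal point (c_x, c_y), the unnormalized surface normal at pixel (u, v) of a depth map Z(u, v) follows in closed form from the depth gradients alone. This is a standard pinhole-camera derivation restated here for reference; the notation is ours, not quoted from the paper:

\mathbf{n}(u, v) \propto
\begin{bmatrix}
f_x \, \partial Z / \partial u \\
f_y \, \partial Z / \partial v \\
-\bigl( Z + (u - c_x) \, \partial Z / \partial u + (v - c_y) \, \partial Z / \partial v \bigr)
\end{bmatrix}

Only the per-pixel gradients \partial Z / \partial u and \partial Z / \partial v are needed, so no 3D point cloud is ever materialized.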

Methodology

Our approach comprises three components: D2NT, the DAG filter, and the MNR module. D2NT translates depth images into surface normal maps in an end-to-end fashion; the DAG filter adaptively generates smoothness-guided direction weights to improve depth gradient estimation in and around discontinuities; and the MNR (surface normal refinement) module further refines the estimated surface normals based on the smoothness of neighboring pixels.
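As a rough illustration of this pipeline, the NumPy sketch below performs the direct depth-to-normal translation using the closed-form relation above, with plain central-difference gradients standing in for the DAG filter and a uniform neighbor-averaging pass standing in for the MNR module. The function names and these simplifications are ours, not the paper's:

import numpy as np

def d2nt_sketch(depth, fx, fy, cx, cy):
    # Pixel-coordinate grids matching the depth image.
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w, dtype=np.float64),
                       np.arange(h, dtype=np.float64))

    # Depth gradients via central differences; the DAG filter would instead
    # adapt the gradient kernel per pixel around depth discontinuities.
    gu = np.gradient(depth, axis=1)  # dZ/du
    gv = np.gradient(depth, axis=0)  # dZ/dv

    # Closed-form normal from depth and its gradients; no 3D point cloud.
    nx = fx * gu
    ny = fy * gv
    nz = -(depth + (u - cx) * gu + (v - cy) * gv)
    n = np.stack([nx, ny, nz], axis=-1)
    return n / (np.linalg.norm(n, axis=-1, keepdims=True) + 1e-12)

def refine_sketch(n, iters=3):
    # Crude stand-in for the MNR module: sum each normal with its four
    # neighbors and re-normalize, which smooths noise in flat regions.
    for _ in range(iters):
        acc = n.copy()
        acc[1:] += n[:-1]; acc[:-1] += n[1:]
        acc[:, 1:] += n[:, :-1]; acc[:, :-1] += n[:, 1:]
        n = acc / (np.linalg.norm(acc, axis=-1, keepdims=True) + 1e-12)
    return n

# Example with synthetic depth and illustrative VGA intrinsics:
depth = np.random.uniform(1.0, 5.0, (480, 640))
normals = refine_sketch(d2nt_sketch(depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5))

Unlike this fixed central-difference version, the actual DAG filter weights gradient directions by local depth smoothness, and the actual MNR module refines normals with smoothness-aware rather than uniform weights.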

Experimental Results

Video Presentation

[Bilibili]      [YouTube]

BibTeX

@inproceedings{feng2023d2nt,
  title={{D2NT}: A High-Performing Depth-to-Normal Translator},
  author={Feng, Yi and Xue, Bohuan and Liu, Ming and Chen, Qijun and Fan, Rui},
  booktitle={2023 IEEE International Conference on Robotics and Automation (ICRA)},
  pages={12360--12366},
  year={2023},
  organization={IEEE}
}
        

Acknowledgements

This work was supported by the National Key R&D Program of China under Grant 2020AAA0108100, the National Natural Science Foundation of China under Grant 62233013, the Science and Technology Commission of Shanghai Municipality under Grant 22511104500, the Fundamental Research Funds for the Central Universities under Grants 22120220184 and 22120220214, and the Shanghai Municipal Science and Technology Major Project under Grant 2021SHZDZX0100.