EPro-PnP: Generalized End-to-End Probabilistic Perspective-n-Points for Monocular Object Pose Estimation.
Journal
IEEE Transactions on Pattern Analysis and Machine Intelligence
ISSN: 1939-3539
Abbreviated title: IEEE Trans Pattern Anal Mach Intell
Country: United States
NLM ID: 9885960
Publication information
Publication date: 16 Jan 2024
History:
medline: 16 Jan 2024
pubmed: 16 Jan 2024
entrez: 16 Jan 2024
Status: aheadofprint
Abstract
Locating 3D objects from a single RGB image via Perspective-n-Point (PnP) is a long-standing problem in computer vision. Driven by end-to-end deep learning, recent studies suggest interpreting PnP as a differentiable layer, allowing for partial learning of 2D-3D point correspondences by backpropagating the gradients of pose loss. Yet, learning the entire set of correspondences from scratch is highly challenging, particularly for ambiguous pose solutions, where the globally optimal pose is theoretically non-differentiable w.r.t. the points. In this paper, we propose EPro-PnP, a probabilistic PnP layer for general end-to-end pose estimation, which outputs a distribution of pose with differentiable probability density on the SE(3) manifold. The 2D-3D coordinates and corresponding weights are treated as intermediate variables learned by minimizing the KL divergence between the predicted and target pose distributions. The underlying principle generalizes previous approaches and resembles the attention mechanism. EPro-PnP can enhance existing correspondence networks, closing the gap between PnP-based methods and the task-specific leaders on the LineMOD 6DoF pose estimation benchmark. Furthermore, EPro-PnP helps to explore new possibilities of network design, as we demonstrate a novel deformable correspondence network with state-of-the-art pose accuracy on the nuScenes 3D object detection benchmark. Our code is available at https://github.com/tjiiv-cprg/EPro-PnP-v2.
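The core idea of the KL loss described above can be illustrated with a minimal sketch. This is not the authors' implementation: it uses a hypothetical 1-DoF pose (a scalar x-translation) instead of the full SE(3) manifold, a toy projection model, and the helper names (`reprojection_cost`, `kl_pose_loss`) are invented for illustration. When the target distribution is a Dirac at the ground-truth pose and the predicted density is p(y) ∝ exp(-cost(y)), the KL loss reduces (up to a constant) to cost(y_gt) plus the log of the normalizing integral, which is approximated here by sampling:

```python
import numpy as np

def reprojection_cost(pose_x, pts3d, pts2d_x, weights):
    # Toy weighted squared reprojection error: "project" the 3D
    # x-coordinates by a pure x-translation and compare against the
    # observed 2D x-coordinates. A real PnP layer uses full SE(3)
    # poses and a pinhole camera model.
    residual = (pts3d[:, 0] + pose_x) - pts2d_x
    return float(np.sum(weights * residual ** 2))

def kl_pose_loss(pose_gt, pts3d, pts2d_x, weights, n_samples=512):
    # Monte Carlo estimate of KL(Dirac(pose_gt) || p), where the
    # predicted density is p(y) proportional to exp(-cost(y)).
    # Up to a constant the loss is cost(y_gt) + log Z, with Z the
    # normalizing integral, estimated here by a numerically stable
    # log-mean-exp over poses sampled uniformly around pose_gt.
    samples = np.linspace(pose_gt - 2.0, pose_gt + 2.0, n_samples)
    costs = np.array([reprojection_cost(y, pts3d, pts2d_x, weights)
                      for y in samples])
    m = (-costs).max()
    log_z = m + np.log(np.mean(np.exp(-costs - m)))
    return reprojection_cost(pose_gt, pts3d, pts2d_x, weights) + log_z
```

Minimizing this loss pulls the cost down at the ground-truth pose while pushing it up elsewhere, which is what lets the 2D-3D coordinates and weights be learned as intermediate variables, including under ambiguous (multi-modal) pose distributions.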
Identifiers
pubmed: 38227417
doi: 10.1109/TPAMI.2024.3354997
Publication types
Journal Article
Languages
eng
Citation subsets
IM