A depth image of an object can be input to a deep neural network to determine a first four-degree-of-freedom pose of the object. The first four-degree-of-freedom pose and a three-dimensional model of the object can be input to a silhouette rendering program to determine a first two-dimensional silhouette of the object. A second two-dimensional silhouette of the object can be determined based on thresholding the depth image. A loss function can be determined based on comparing the first two-dimensional silhouette of the object to the second two-dimensional silhouette of the object. Deep neural network parameters can be optimized based on the loss function, and the deep neural network can be output.
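
The abstract describes a self-supervised training loop: a network predicts a 4-DoF pose from a depth image, the pose is used to render a silhouette of the object's 3-D model, and that rendered silhouette is compared against a silhouette obtained by thresholding the same depth image. The sketch below is a minimal illustration of that loop, assuming a PyTorch-style pose network and a toy differentiable point-splatting renderer; the names `PoseNet`, `render_silhouette`, `depth_to_silhouette`, the depth threshold, and the mean-squared-error silhouette loss are all illustrative assumptions, not the patent's actual implementation.

```python
# Minimal sketch of the training loop described in the abstract.
# Assumptions (not from the patent): a PyTorch pose network, a toy
# differentiable silhouette renderer, a fixed depth threshold, and an
# MSE silhouette loss. All names and shapes are illustrative.
import torch
import torch.nn as nn


class PoseNet(nn.Module):
    """Hypothetical CNN mapping a depth image to a 4-DoF pose (x, y, z, yaw)."""

    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(32, 4)  # x, y, z translation + yaw angle

    def forward(self, depth):
        return self.head(self.backbone(depth))


def render_silhouette(pose, points, image_size, focal=500.0, sigma=1.5):
    """Toy differentiable "renderer": rotate the 3-D model's point cloud by yaw,
    translate it, project it with a pinhole camera, and splat each point as a
    Gaussian to form a soft [B, H, W] silhouette. A stand-in for a real
    differentiable rasterizer of the object's mesh."""
    B, (H, W) = pose.shape[0], image_size
    x, y, z, yaw = pose[:, 0:1], pose[:, 1:2], pose[:, 2:3], pose[:, 3]
    cos, sin = torch.cos(yaw).unsqueeze(1), torch.sin(yaw).unsqueeze(1)
    # Rotate model points about the vertical axis, then translate by (x, y, z).
    px = points[:, 0].unsqueeze(0) * cos - points[:, 2].unsqueeze(0) * sin + x
    pz = points[:, 0].unsqueeze(0) * sin + points[:, 2].unsqueeze(0) * cos + z
    py = points[:, 1].unsqueeze(0) + y
    # Pinhole projection to pixel coordinates.
    u = focal * px / pz.clamp(min=1e-3) + W / 2  # pixel column
    v = focal * py / pz.clamp(min=1e-3) + H / 2  # pixel row
    # Gaussian splat each projected point onto the image grid.
    us = torch.arange(W, dtype=pose.dtype, device=pose.device).view(1, 1, 1, W)
    vs = torch.arange(H, dtype=pose.dtype, device=pose.device).view(1, 1, H, 1)
    d2 = (us - u.view(B, -1, 1, 1)) ** 2 + (vs - v.view(B, -1, 1, 1)) ** 2
    return torch.exp(-d2 / (2 * sigma ** 2)).amax(dim=1)


def depth_to_silhouette(depth, threshold=2.0):
    """Second silhouette: threshold the depth image so that pixels closer than
    `threshold` (assumed metres) are treated as belonging to the object."""
    return (depth.squeeze(1) < threshold).float()


def training_step(net, optimizer, depth, model_points):
    # 1) First 4-DoF pose of the object from the deep neural network.
    pose = net(depth)
    # 2) First silhouette: render the 3-D model at the estimated pose.
    rendered = render_silhouette(pose, model_points, depth.shape[-2:])
    # 3) Second silhouette: threshold the input depth image.
    target = depth_to_silhouette(depth)
    # 4) Loss from comparing the two silhouettes (MSE here; the abstract
    #    does not specify the exact loss form).
    loss = ((rendered - target) ** 2).mean()
    # 5) Optimize the network parameters based on the loss.
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

A real system would replace the toy point-splatting renderer with a proper differentiable rasterizer of the object's mesh and might use an IoU or cross-entropy silhouette loss, but the gradient path from the rendered silhouette back to the pose network is the same as sketched here.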


    Title:

    Object pose estimation


    Contributors:

    Publication date:

    2024-06-11


    Type of media:

    Patent


    Type of material:

    Electronic Resource


    Language:

    English


    Classification:

    IPC:    G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL / B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION / G05B CONTROL OR REGULATING SYSTEMS IN GENERAL / G06N COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS



    OBJECT POSE ESTIMATION

    SHRIVASTAVA SHUBHAM / PANDEY GAURAV / CHAKRAVARTY PUNARJAY | European Patent Office | 2023


    OBJECT POSE ESTIMATION

    PAVONE MARCO / YANG HENG | European Patent Office | 2024


    Pose-RCNN: Joint object detection and pose estimation using 3D object proposals

    Braun, Markus / Rao, Qing / Wang, Yikang et al. | IEEE | 2016



    Vision-Based Categorical Object Pose Estimation and Manipulation

    Meng, Qiwei / Liao, Jianfeng / Jun, Shao et al. | TIBKAT | 2023