
Pose Detection

  1. Overview
  2. Pose Detection
    1. Inference Engine and Algorithm
    2. Running Pose Detection
      1. Using Images for Inference
        1. Default Image
        2. Custom Image
      2. Using Video Source for Inference
        1. Video File
        2. Video Camera or Webcam
    3. Extra Parameters
  3. References

Overview

In computer vision and robotics, a typical task is to identify specific objects in an image and to determine each object's position and orientation relative to some coordinate system. This information can then be used, for example, to allow a robot to manipulate an object or to avoid colliding with it. The combination of position and orientation is referred to as the pose of an object, even though the term is sometimes used to describe the orientation alone. Exterior orientation and translation are also used as synonyms of pose [1].
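The pose described above can be made concrete as a translation vector plus a rotation. A minimal sketch (the rotation angle and position values are illustrative, not from the demo):

```python
import numpy as np

# A pose combines a position (translation) and an orientation (rotation).
# Here the orientation is a 90-degree rotation about the z-axis.
theta = np.pi / 2
R = np.array([
    [np.cos(theta), -np.sin(theta), 0.0],
    [np.sin(theta),  np.cos(theta), 0.0],
    [0.0,            0.0,           1.0],
])
t = np.array([1.0, 0.0, 0.0])  # object position in the camera frame

def object_to_camera(p_obj):
    """Map a point from object coordinates to camera coordinates using the pose (R, t)."""
    return R @ p_obj + t

# The object's x-axis direction maps onto the camera's y-axis:
print(object_to_camera(np.array([1.0, 0.0, 0.0])))
```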

Pose Detection

Inference Engine and Algorithm


This demo uses:

  • TensorFlow Lite as the inference engine [2];
  • MobileNet as the default algorithm [3].

More details are available on the eIQ™ page.

NOTE: This demo needs a quantized model to work properly.
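The note above can be checked programmatically: a quantized .tflite model takes 8-bit integer input tensors, while a float model takes float32. A minimal sketch of that check (the model path in the comment is illustrative):

```python
import numpy as np

def is_quantized(input_dtype):
    """True when the input tensor dtype indicates a quantized model
    (8-bit integers) rather than a float32 model."""
    return np.dtype(input_dtype) in (np.dtype(np.uint8), np.dtype(np.int8))

# With TensorFlow Lite available, the dtype can be read from the model itself:
#   import tflite_runtime.interpreter as tflite
#   interpreter = tflite.Interpreter(model_path="posenet.tflite")
#   dtype = interpreter.get_input_details()[0]["dtype"]
#   print(is_quantized(dtype))
```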

Running Pose Detection

Using Images for Inference

Default Image
  1. Run the Pose Detection demo using the following line:
    # pyeiq --run pose_detection
    
    • This runs inference on a default image bundled with the demo.
Custom Image
  1. Pass any image as an argument:
    # pyeiq --run pose_detection --image=/path_to_the_image
    

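The same commands can be driven from a Python script. The helper below only assembles the command line documented above; running it requires pyeiq to be installed on the target board, and the example image path is illustrative:

```python
import subprocess

def pose_detection_cmd(image_path=None):
    """Build the documented pyeiq command line for the pose detection demo."""
    cmd = ["pyeiq", "--run", "pose_detection"]
    if image_path is not None:
        cmd.append(f"--image={image_path}")
    return cmd

# On the target board (requires pyeiq installed):
#   subprocess.run(pose_detection_cmd("/home/root/image.jpg"), check=True)
```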
Using Video Source for Inference

Video File
  1. Run the Pose Detection demo on a video file using the following line:
    # pyeiq --run pose_detection --video_src=/path_to_the_video
    
Video Camera or Webcam
  1. Specify the camera device:
    # pyeiq --run pose_detection --video_src=/dev/video<index>
    
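To find a valid <index>, the available V4L2 device nodes can be enumerated first. A minimal sketch (index 0 is usually, but not always, the first camera):

```python
import glob
import re

def video_index(node):
    """Extract the numeric <index> from a /dev/video<index> node, else None."""
    m = re.fullmatch(r"/dev/video(\d+)", node)
    return int(m.group(1)) if m else None

def list_camera_devices():
    """Return candidate camera device nodes sorted by index, lowest first."""
    nodes = [n for n in glob.glob("/dev/video*") if video_index(n) is not None]
    return sorted(nodes, key=video_index)
```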

Extra Parameters

  1. Use the --help argument to check all the available configurations:
    # pyeiq --run pose_detection --help
    

References

  1. https://en.wikipedia.org/wiki/Pose_(computer_vision) 

  2. https://www.tensorflow.org/lite 

  3. https://arxiv.org/abs/1704.04861