
Tracking Hands and Fingers With LeapMotion Plugin

Overview#

The LeapMotion plugin allows you to track hands and fingers in your UNIGINE-based application.

Notice
The plugin is available only on Windows.

The Leap Motion system recognizes and tracks hands and fingers. The device operates at close range with high precision and a high tracking frame rate, reporting discrete positions and motion.

The Leap Motion controller uses optical sensors and infrared light. The sensors are directed along the Y axis — upward when the controller is in its standard operating position — and have a field of view of about 150 degrees. The effective range of the Leap Motion Controller extends from approximately 25 to 600 millimeters above the device (1 inch to 2 feet).

Leap Motion controller's view of your hands

Detection and tracking work best when the controller has a clear, high-contrast view of an object's silhouette. The Leap Motion software combines its sensor data with an internal model of the human hand to help cope with challenging tracking conditions.

Coordinate System#

The Leap Motion system uses a right-handed Cartesian coordinate system. The origin is centered at the top of the Leap Motion Controller. The X and Z axes lie in the horizontal plane, with the X axis running parallel to the long edge of the device. The Y axis is vertical, with positive values increasing upwards (in contrast to the downward orientation of most computer graphics coordinate systems). The Z axis has positive values increasing toward the user.

Leap Motion right-handed coordinate system
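
The sketch below shows one possible way to remap a point reported by the controller into a Z-up, meter-based world space such as UNIGINE's. It only illustrates the axis convention described above: the function name, the target axis convention, and the use of the UNIGINE math types are assumptions made for this example, not part of the plugin API.

Source code (C++)
#include <UnigineMathLib.h>

using namespace Unigine::Math;

// Illustrative helper (not part of the plugin API): converts a point reported
// by the Leap Motion controller (millimeters; X along the long edge of the
// device, Y up, Z toward the user) into an assumed Z-up, meter-based world
// space (X right, Y away from the user, Z up).
vec3 leapToWorld(const vec3 &leap_point_mm)
{
	const float MM_TO_M = 0.001f;
	return vec3(
		leap_point_mm.x * MM_TO_M,   // X keeps running along the long edge of the device
		-leap_point_mm.z * MM_TO_M,  // Leap +Z points toward the user, so "forward" is -Z
		leap_point_mm.y * MM_TO_M);  // Leap +Y (up) becomes world +Z (up)
}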

Hands#

The hand model provides information about the identity, position, and other characteristics of a detected hand, the arm to which the hand is attached, and lists of the fingers associated with the hand.

Hands are represented by the Hand class.

Hand normal and direction vectors define the orientation of the hand
Notice
More than two hands can appear in the hand list for a frame if more than one person's hands or other hand-like objects are in view. However, it is recommended to keep at most two hands in the Leap Motion Controller's field of view for optimal motion tracking quality.
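
The palm normal and hand direction vectors pictured above are sufficient to reconstruct a full hand orientation. The following minimal sketch builds an orthonormal basis from these two vectors; it assumes the UNIGINE math helpers (normalize, cross, mat3::setColumn) and does not rely on the plugin API itself.

Source code (C++)
#include <UnigineMathLib.h>

using namespace Unigine::Math;

// Illustrative sketch: build a rotation basis for a hand from its direction
// (pointing from the palm toward the fingers) and its palm normal (pointing
// out of the palm). The inputs are assumed to come from the tracked hand data.
mat3 handBasis(const vec3 &direction, const vec3 &palm_normal)
{
	vec3 forward = normalize(direction);          // toward the fingers
	vec3 down = normalize(palm_normal);           // out of the palm
	vec3 side = normalize(cross(down, forward));  // completes the right-handed basis
	down = cross(forward, side);                  // re-orthogonalize

	mat3 basis;
	basis.setColumn(0, side);
	basis.setColumn(1, forward);
	basis.setColumn(2, -down);                    // up axis of the hand
	return basis;
}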

Arms#

An arm is a bone-like object that provides the orientation, length, width, and end points of the arm. When the elbow is not in view, the Leap Motion controller estimates its position based on past observations as well as typical human proportions.

Arms are represented by the Arm class.

Fingers#

The Leap Motion controller provides information about each finger on a hand. If all or part of a finger is not visible, the finger characteristics are estimated based on recent observations and the anatomical model of the hand. Fingers are identified by type name: thumb, index, middle, ring, and pinky.

Fingers are represented by the Finger class.

Finger tip position and direction provide the position of a finger tip and the general direction in which a finger is pointing
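
Since each finger reports both a tip position and a pointing direction, a common use is casting a ray from the fingertip, for example to find where the user is pointing on a virtual plane. Below is a minimal, self-contained ray-plane intersection sketch; all names are illustrative and independent of the plugin API.

Source code (C++)
#include <UnigineMathLib.h>

using namespace Unigine::Math;

// Illustrative sketch: intersect the ray cast from a fingertip along the
// finger's pointing direction with a plane given by a point and a normal.
// Returns false when the finger points away from or parallel to the plane.
bool fingerRayToPlane(const vec3 &tip_position, const vec3 &tip_direction,
	const vec3 &plane_point, const vec3 &plane_normal, vec3 &hit_point)
{
	vec3 dir = normalize(tip_direction);
	float denom = dot(plane_normal, dir);
	if (denom > -1.0e-6f && denom < 1.0e-6f)
		return false; // pointing parallel to the plane

	float t = dot(plane_normal, plane_point - tip_position) / denom;
	if (t < 0.0f)
		return false; // the plane is behind the fingertip

	hit_point = tip_position + dir * t;
	return true;
}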

Bones#

Each finger has a set of bones describing the positions and orientations of the corresponding anatomical finger bones. All fingers contain four bones ordered from base to tip.

Bones are represented by the Bone class.

Palm and all its finger bones

The bones are identified as:

  • Metacarpal — the bone inside the hand connecting the finger to the wrist (except the thumb).
  • Proximal Phalanx — the bone at the base of the finger, connected to the palm.
  • Intermediate Phalanx — the middle bone of the finger, between the tip and the base.
  • Distal Phalanx — the terminal bone at the end of the finger.
Notice
This thumb model does not quite match the standard anatomical naming system. A real thumb has one less bone than the other fingers. However, for ease of programming, the Leap Motion thumb model includes a zero-length metacarpal bone, so that the thumb has the same number of bones at the same indexes as the other fingers. As a result, the thumb's anatomical metacarpal bone is labeled as a proximal phalanx, and the anatomical proximal phalanx is labeled as the intermediate phalanx in the Leap Motion finger bone model.
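
Because the bones are always ordered from base to tip and the thumb keeps a zero-length metacarpal, bones can be addressed by a fixed index for every finger. The structures below are hypothetical stand-ins for the plugin's Finger and Bone data, shown only to illustrate that ordering.

Source code (C++)
#include <UnigineMathLib.h>

using namespace Unigine::Math;

// Illustrative data model (not the plugin API): the four bones of a finger,
// ordered from base to tip, following the naming described above.
enum BoneType
{
	BONE_METACARPAL = 0,  // inside the hand (zero-length for the thumb)
	BONE_PROXIMAL,        // base of the finger, connected to the palm
	BONE_INTERMEDIATE,    // middle bone of the finger
	BONE_DISTAL,          // terminal bone at the fingertip
	NUM_BONE_TYPES,
};

struct BoneData
{
	vec3 prev_joint; // end of the bone closer to the wrist
	vec3 next_joint; // end of the bone closer to the fingertip
};

// The thumb's metacarpal yields a length of 0, while its indexes stay
// consistent with the other fingers.
float getBoneLength(const BoneData &bone)
{
	return length(bone.next_joint - bone.prev_joint);
}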

Sensor Images#

Along with the computed tracking data, you can get the raw sensor images from the Leap Motion cameras.

A raw sensor image with superimposed calibration points

The image data contains the measured IR brightness values and the calibration data required to correct for the complex lens distortion. You can use the sensor images for augmented reality applications, especially when the Leap Motion hardware is mounted to a VR headset.

See Also#

LeapMotion API:

  • The LeapMotion interface article for more details on managing LeapMotion via API.
  • The LeapMotion Arm class article for more details on managing arms via API.
  • The LeapMotion Bone class article for more details on managing finger bones via API.
  • The LeapMotion Finger class article for more details on managing fingers via API.
  • The LeapMotion Hand class article for more details on managing hands via API.

Implementing a UNIGINE Application with LeapMotion Support#

To use the LeapMotion plugin in your UNIGINE application, perform the following:

  1. Download the Leap Motion SDK and install Leap Motion device drivers.
  2. Create a new project with LeapMotion support via UNIGINE SDK Browser: click Plugins, check the LeapMotion support (LeapMotion plugin) option in the form that opens and click OK.

    LeapMotion Plugin on Plugins Panel
    Notice
    To add LeapMotion support to the existing project, in UNIGINE SDK Browser, click Other Actions -> Configure Project -> Plugins -> LeapMotion support (LeapMotion plugin) -> OK.
  3. Implement your application.
  4. Launch the LeapMotion plugin at application start-up.

Launching LeapMotion#

To launch the plugin, specify the extern_plugin command-line option at application start-up as follows:

Shell commands
main_x64d -extern_plugin UnigineLeapMotion

If you run the application via UNIGINE SDK Browser, specify the command-line options given above in the Customize Run Options form.
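
If you need to verify at run time that the plugin has actually been loaded before relying on hand tracking, one option is to look it up by name via the engine. The sketch below assumes that Engine::findPlugin() is available in your SDK version and that the plugin is registered under the same name passed to extern_plugin; check both against the API reference.

Source code (C++)
#include <UnigineEngine.h>
#include <UnigineLog.h>

using namespace Unigine;

// A minimal sketch, assuming Engine::findPlugin() returns -1 when the plugin
// with the given name is not loaded.
void checkLeapMotionPlugin()
{
	if (Engine::get()->findPlugin("UnigineLeapMotion") == -1)
		Log::warning("LeapMotion plugin is not loaded; hand tracking is unavailable.\n");
}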
