LPVR Configuration Settings

Overview of LPVR-CAD / LPVR-DUO Signal Flow

LPVR-CAD combines input from an optical tracking source with data from an IMU attached to or built into the head-mounted display. For the combiner to be able to fuse the information from both sources, they need to be in the same coordinate system. Pose data from the optical tracking system is transformed into headset IMU coordinates. After the fusion, the resulting pose data is transformed back into the global coordinate system and output to the 3D engine. An overview of the signal flow is shown in the diagram below. Transformations are marked in blue.

LPVR-CAD signal flow

LPVR-DUO adds another combiner, the so-called differential combiner in front of the default combiner. For the system to operate inside a car, car rotation and head rotation need to be separated. The differential combiner takes as input the HMD IMU data and combines it with gyroscope and accelerometer information from the platform IMU. The operation takes place in the IMU coordinate system. The block diagram below shows an overview of the signal flow.

LPVR-DUO signal flow

JSON File Structure Overview

The JSON configuration file defines the input/output and processing components within LPVR. More specifically, there is an overall header tag and four sub-module types:

 

JSON Tag Name / Function

1. PoseMachineConfig: Header tag that precedes the definition of all sub-modules

2. absoluteSources: Defines the absolute positioning (optical tracking) devices used to acquire global poses

3. imuSources: Defines sources for IMU orientation data

4. trackedObjects: Defines how objects are tracked, i.e. their data sources and how the different data sources are combined

5. emitters: Defines the output target of the sensor fusion done for trackedObjects. This is usually an HMD.

Together these components define the flow from input signals to output signals. The input to a typical system consists of an absoluteSources structure for an optical tracking input, one or more imuSources, and one or more emitters to output the information. All input data is combined in the trackedObjects structure, which defines the parameters of the sensor fusion and the signal routing. The figure below shows an overview of the structure of a configuration file that works for differential IMU headset tracking.

The outer-most structure “PoseMachineConfig” : { } should only be added when editing settings.json directly. When editing the configuration in the integrated editor on the configuration page, it needs to be omitted.
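As an illustration, the overall structure of a settings.json edited directly might look like the following sketch. The module names (my_dtrack, hmd_imu, hmd, my_hmd) are placeholders, and the empty objects stand in for the module definitions described in the sections below:

```json
{
  "PoseMachineConfig": {
    "absoluteSources": { "my_dtrack": { } },
    "imuSources":      { "hmd_imu":   { } },
    "trackedObjects":  { "hmd":       { } },
    "emitters":        { "my_hmd":    { } }
  }
}
```

When using the integrated editor on the configuration page, only the inner object (everything inside "PoseMachineConfig") would be entered.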

Optical Tracking Sources

The absoluteSources tag describes an absolute position and orientation source such as an optical tracking system. Currently LPVR natively supports the VICON, OptiTrack and ART tracking systems, as well as the common VR communication protocol VRPN. "absoluteSources" is an object whose properties describe the optical tracking systems in use. The name of each property is used to reference the tracking system in the remainder of the configuration. Examples for various tracking systems are given below.

Optical Tracking System

Example Code Block

Explanation

 


 

ART

"absoluteSources": {   "my_dtrack": {     "settings": {       "axisPermutation": "xyz",       "host": "192.168.1.38",       "port": 5000     },     "type": "DTrack"   } }

type

Must be DTrack

settings: axisPermutation

Optional axis permutation setting to adjust coordinate system

settings: host

Address of the host PC

settings: port

Port number of host PC

 

VICON

"absoluteSources": {   "my_vicon": {     "settings": {       "host": "192.168.1.6:801", "trackedObjects": [ {        "subject": "HMD",        "segment": "root" } ]     },     "type": "Vicon"   } }

 

type

Must be Vicon

settings: host

IP address of the VICON host computer running VICON Tracker, Blade etc.

settings: trackedObjects

An array of JSON objects. Each contains "subject" (required) and "segment" (defaults to "root") and is used to identify the VICON objects used in further processing. The remainder of the configuration can reference them by their numerical index (starting from zero) in this array.

 

Optitrack

"absoluteSources": {   "my_optitrack": {     "settings": {       "connectionType": "Multicast",       "localAddress": "127.0.0.1",       "remoteAddress": "127.0.0.1",       "serverCommandPort": 1510,       "serverDataPort": 1511     },     "type": "OptiTrack"   } }

type

Must be OptiTrack

settings: connectionType

Can be Multicast or Unicast

settings: localAddress

Local address of the Optitrack client

settings: remoteAddress

Address of the Optitrack server

settings: serverCommandPort

Must be 1510

settings: serverDataPort

Must be 1511

 

VRPN

type

Must be VRPN

settings: tracker

Name and address of VRPN server
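No example code block is given above for VRPN; by analogy with the other tracking systems, a configuration might look like the following sketch. The source name my_vrpn and the tracker address Tracker0@localhost are placeholders that need to be adapted to the actual VRPN server:

```json
"absoluteSources": {
  "my_vrpn": {
    "settings": {
      "tracker": "Tracker0@localhost"
    },
    "type": "VRPN"
  }
}
```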

 

 

IMU Sources

The imuSources tag describes the IMUs in use. It is a JSON object whose properties define the individual IMUs.

Example Code Block

Explanation


Each member of the "imuSources" object has an identifier which defines the name of the source. Any name can be used; it is referenced in trackedObjects further down. The individual objects contain a required "type" field and an optional "settings" field.

type

Can either be 

  • OpenZen to use the OpenZen library

  • ViveHeadset to use internal Vive IMU

  • OpenVR for the IMUs in SteamVR controllers

  • Varjo for the Varjo headset IMU

  • LpSensor to use LpSensor library (deprecated)

  • None for a dummy IMU that doesn’t emit any data

settings: name

Specifies the ID of the connected IMU (not required for ViveHeadset).

settings: autodetectType

For "type": "OpenZen" the driver will use autodetection if "name" is omitted. Usually the default "ig1" is appropriate, but in some experimental environments "lpms"might be the correct type.

Output of Pose Calculation

The orientation and position calculated by sensor fusion of IMU and optical tracking is output to a headset by an emitter. The emitter allows setting parameters determining the position and orientation offset between IMU coordinate system and the optical coordinate system of the headset.

Example Code Block

Explanation


name

Defines the name of the output device. Any name can be used; it is referenced in the trackedObjects tag.

HMD Emitter

Emits orientation to HMD

settings: imuFromEyeQuat

Rotation from the eye frame to the IMU frame in which the graphics are rendered. To operate in the optical coordinate system, imuFromEyeQuat must be the inverse of absoluteFromImuFrameQuat. Note: imuToEyeQuat is not a valid parameter; please don't use it.

settings: imuFromEyeVect

Translation from the eye frame to the IMU frame in which the graphics are rendered. Note: imuToEyeVect is not a valid parameter; please don't use it.

settings: type

Must be HMD

type

Must be OpenVR

Console Emitter

Displays orientation as SteamVR console output

settings: interval

Interval between log outputs in ms

name

Must be Console
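Based on the parameters listed above, an emitters block for an HMD might look like the following sketch. The emitter name my_hmd is a placeholder, and the identity quaternion and zero vector are illustrative values that must be replaced by calibrated offsets for the actual headset:

```json
"emitters": {
  "my_hmd": {
    "settings": {
      "imuFromEyeQuat": [1.0, 0.0, 0.0, 0.0],
      "imuFromEyeVect": [0.0, 0.0, 0.0],
      "type": "HMD"
    },
    "type": "OpenVR"
  }
}
```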

Pose Calculation 

The actual pose of an object is calculated by combining IMU data and optical tracking information. This tag combines the modules we defined above. The resulting pose is forwarded to the emitter block.

Example Code Block

Explanation


absoluteSource:name

Name of the previously defined absolute source (VICON etc.).

absoluteSource:trackingId

ID of object tracked by optical system.

combinerType

Type of sensor fusion used. 

  • Default: uses single-IMU fusion

  • DifferentialImu: uses differential dual-IMU operation (LPVR-DUO)

emitterName

Name of emitter to output data to

imuSource

Name of IMU source declared above (headset IMU)

settings: absoluteFromImuFrameQuat

Orientation of the tracked body frame relative to the IMU frame. Make sure this parameter is spelled correctly in your configuration; its exact naming convention has caused some confusion in the past, also internally at LP.

settings: absoluteFromImuFrameVect

Translation of the tracked body frame relative to the IMU frame

settings: ignoreGravity

If true, accelerometer data from the headset IMU is not used to correct the HMD pitch and roll orientation.

This should be true for in vehicle applications and false for stationary installations.

settings: opticalWeight

Impact of the optical orientation tracking on orientation measurements. Default = 0.005

settings: reference_imu

Name of the IMU to be used as reference (fixed to vehicle) IMU

settings: referenceOrientationQuat

Orientation of the reference IMU body inside the optical tracking space

settings: referenceToOpticalQuat

Rotation to translate from reference IMU internal coordinate system to optical tracking coordinate system
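Combining the modules from the previous sections, a trackedObjects block for differential (LPVR-DUO) operation might look like the following sketch. All names (hmd, my_dtrack, my_hmd, hmd_imu, platform_imu) reference the placeholder identifiers used in the earlier examples, and the identity quaternions, zero vector and trackingId are illustrative values that must be replaced by values calibrated for the actual setup:

```json
"trackedObjects": {
  "hmd": {
    "absoluteSource": {
      "name": "my_dtrack",
      "trackingId": 0
    },
    "combinerType": "DifferentialImu",
    "emitterName": "my_hmd",
    "imuSource": "hmd_imu",
    "settings": {
      "absoluteFromImuFrameQuat": [1.0, 0.0, 0.0, 0.0],
      "absoluteFromImuFrameVect": [0.0, 0.0, 0.0],
      "ignoreGravity": true,
      "opticalWeight": 0.005,
      "reference_imu": "platform_imu",
      "referenceOrientationQuat": [1.0, 0.0, 0.0, 0.0],
      "referenceToOpticalQuat": [1.0, 0.0, 0.0, 0.0]
    }
  }
}
```

Note that "ignoreGravity": true matches an in-vehicle application; for a stationary installation it should be false, and the Default combinerType would be used instead for single-IMU (LPVR-CAD) fusion.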

LPVIZ: Procedure for Manual Adjustment of HMD Marker Target

  • Adjust the optical target first: set imuSource to no_imu (you need to create a no_imu source first) and adjust the target live in DTrack (ART) or Motive (OptiTrack). Make sure the coordinate system center in the SteamVR grid is exactly in the center between the two SmartTrack cameras, or wherever you set the center of your coordinate system.

  • Then, switch on the HMD IMU (imuSource: hmd imu) and calibrate optical vs. IMU.

  • To operate in the optical coordinate system, imuFromEyeQuat must be the INVERSE of absoluteFromImuFrameQuat.

  • Check a simple Unreal or Unity test scene with a single object at (0, 0, 0) and player start at (0, 0, 0), rot = (0, 0, 180). You should observe very little distortion and shifting when looking around the object.

  • Only after this has been confirmed, test in the moving vehicle.