LPVR Configuration Settings

Overview of LPVR-CAD / LPVR-DUO Signal Flow

LPVR-CAD combines input from an optical tracking source with data from an IMU attached to or built into the head-mounted display. For the combiner to fuse the information from both sources, they need to be in the same coordinate system. Pose data from the optical tracking system is transformed into headset IMU coordinates. After the fusion, the resulting pose data is transformed back into the global coordinate system and output to the 3D engine. An overview of the signal flow is shown in the diagram below. Transformations are marked in blue.

LPVR-CAD signal flow

LPVR-DUO adds another combiner, the so-called differential combiner, in front of the default combiner. For the system to operate inside a car, car rotation and head rotation need to be separated. The differential combiner takes the HMD IMU data as input and combines it with gyroscope and accelerometer information from the platform IMU. The operation takes place in the IMU coordinate system. The block diagram below shows an overview of the signal flow.

LPVR-DUO signal flow

JSON File Structure Overview

The JSON configuration file defines the input/output and processing components within LPVR. More specifically, there is an overall header tag and four sub-module types:

 

JSON Tag Name – Function

  1. PoseMachineConfig – Header tag that precedes the definition of all sub-modules
  2. absoluteSources – Defines the absolute positioning (optical tracking) devices used to acquire global poses
  3. imuSources – Defines sources for IMU orientation data
  4. trackedObjects – Defines how objects are tracked, i.e. their data sources and how the different data sources are combined
  5. emitters – Defines the output target of the sensor fusion done for trackedObjects. This is usually an HMD.

The configuration file consists of these components to define the flow of input signals to output signals. The input to a typical system would consist of an absoluteSources structure for an optical tracking input, one or more imuSources and one or more emitters to output the information. All input data is combined in the trackedObjects structure that defines the parameters of the sensor fusion and signal routing. The figure below shows an overview of the structure of a configuration file that works for differential IMU headset tracking.

The outer-most structure “PoseMachineConfig” : { } should only be added when editing settings.json directly. When editing the configuration in the integrated editor on the configuration page, it must be omitted.
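As a sketch, a complete settings.json nests the four sub-module arrays inside the PoseMachineConfig header tag. The arrays are left empty here for brevity; in a real file each one holds the entries described in the sections below:

```json
{
  "PoseMachineConfig": {
    "absoluteSources": [],
    "imuSources": [],
    "trackedObjects": [],
    "emitters": []
  }
}
```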

Optical Tracking Sources

The absoluteSources tag describes an absolute position and orientation source such as an optical tracking system. Currently LPVR supports the VICON, OptiTrack and ART tracking systems natively, as well as the common VR communication protocol VRPN.

Optical Tracking System

Example Code Block

Explanation


ART

```json
"absoluteSources": [
  {
    "name": "my_dtrack",
    "settings": {
      "axisPermutation": "xyz",
      "host": "192.168.1.38",
      "port": 5000
    },
    "type": "DTrack"
  }
]
```

  • name – Defines the name of the source. Any name is good.
  • settings: axisPermutation – Optional axis permutation setting to adjust the coordinate system
  • settings: host – Address of the host PC
  • settings: port – Port number of the host PC
  • type – Must be DTrack

VICON

```json
"absoluteSources": [
  {
    "name": "my_vicon",
    "settings": {
      "host": "192.168.1.6:801",
      "segmentName": "HMD",
      "subjectName": "HMD"
    },
    "type": "Vicon"
  }
]
```

  • name – Defines the name of the source. Any name is good. Will be referenced in trackedObjects further down.
  • settings: host – IP address of the VICON host computer running VICON Tracker, Blade etc.
  • settings: segmentName – Name of the rigid body in the VICON software
  • settings: subjectName – Should be the same as the segmentName
  • type – Must be Vicon

Optitrack

```json
"absoluteSources": [
  {
    "name": "my_optitrack",
    "settings": {
      "connectionType": "Multicast",
      "localAddress": "127.0.0.1",
      "remoteAddress": "127.0.0.1",
      "serverCommandPort": 1510,
      "serverDataPort": 1511
    },
    "type": "OptiTrack"
  }
]
```

  • name – Defines the name of the source. Any name is good. Will be referenced in trackedObjects further down.
  • settings: connectionType – Must be Multicast
  • settings: localAddress – Local address of the OptiTrack client
  • settings: remoteAddress – Address of the OptiTrack server
  • settings: serverCommandPort – Must be 1510
  • settings: serverDataPort – Must be 1511
  • type – Must be OptiTrack

VRPN
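The example code block for VRPN is missing from this table; the following is a minimal sketch based on the fields listed below. The source name and the tracker string (in VRPN's usual device@host format) are hypothetical placeholders:

```json
"absoluteSources": [
  {
    "name": "my_vrpn",
    "settings": {
      "tracker": "Tracker0@192.168.1.10"
    },
    "type": "VRPN"
  }
]
```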

  • name – Defines the name of the source. Any name is good. Will be referenced in trackedObjects further down.
  • settings: tracker – Name and address of the VRPN server
  • type – Must be VRPN

IMU Sources

The imuSources tag describes the IMU attached to the headset. At the moment only the LP-RESEARCH LPMS IMU is supported.

Example Code Block

Explanation

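The example code block for this table is missing; the following is a minimal sketch of an imuSources entry, assuming an LPMS sensor connected via OpenZen. The id and the sensor name are hypothetical placeholders:

```json
"imuSources": [
  {
    "id": "hmd_imu",
    "settings": {
      "name": "LPMSCU2000573"
    },
    "type": "OpenZen"
  }
]
```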

  • id – Defines the name of the source. Any name is good. Will be referenced in trackedObjects further down.
  • type – Can either be
      • OpenZen to use the OpenZen library
      • ViveHeadset to use the internal Vive IMU
      • LpSensor to use the LpSensor library (deprecated)
      • None for a dummy IMU that doesn't emit any data
  • settings: name – Specifies the ID of the connected IMU (not required for ViveHeadset)

Output of Pose Calculation

The orientation and position calculated by sensor fusion of IMU and optical tracking data are output to a headset by an emitter. The emitter allows setting parameters that determine the position and orientation offset between the IMU coordinate system and the optical coordinate system of the headset.

Example Code Block

Explanation

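The example code block for this table is missing; the following is a minimal sketch of an emitters entry for an HMD, following the fields listed below. The name and the offset values are hypothetical placeholders (identity rotation, zero translation):

```json
"emitters": [
  {
    "name": "my_hmd",
    "settings": {
      "imuToEyeQuat": [1.0, 0.0, 0.0, 0.0],
      "imuToEyeVect": [0.0, 0.0, 0.0],
      "type": "HMD"
    },
    "type": "OpenVR"
  }
]
```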

  • name – Defines the name of the output device. Any name is good. Will be referenced in the trackedObjects tag.

HMD Emitter – Emits orientation to the HMD

  • settings: imuToEyeQuat – Rotation from the IMU frame to the eye frame in which the graphics are rendered
  • settings: imuToEyeVect – Translation from the IMU frame to the eye frame in which the graphics are rendered
  • settings: type – Must be HMD
  • type – Must be OpenVR

Console Emitter – Displays orientation as SteamVR console output

  • settings: interval – Interval between log outputs in ms
  • name – Must be Console

Pose Calculation

The actual pose of an object is calculated by combining IMU data and optical tracking information. This tag combines the modules we defined above. The resulting pose is forwarded to the emitter block.

Example Code Block

Explanation

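The example code block for this table is missing; the following is a minimal sketch of a trackedObjects entry that wires the modules above together, following the fields listed below. The referenced names, the trackingId, the nesting of the absoluteSource fields and all numeric values are hypothetical placeholders:

```json
"trackedObjects": [
  {
    "absoluteSource": {
      "name": "my_vicon",
      "trackingId": 1
    },
    "combinerType": "Default",
    "emitterName": "my_hmd",
    "imuSource": "hmd_imu",
    "settings": {
      "absoluteFromImuFrameQuat": [1.0, 0.0, 0.0, 0.0],
      "absoluteFromImuFrameVect": [0.0, 0.0, 0.0],
      "ignoreGravity": false,
      "opticalWeight": 0.005
    }
  }
]
```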

  • absoluteSource: name – Name of the previously defined absolute source (VICON etc.)
  • absoluteSource: trackingId – ID of the object tracked by the optical system
  • combinerType – Type of sensor fusion used:
      • Default: uses single-IMU fusion
      • DifferentialImu: uses differential dual-IMU operation (LPVR-DUO)
  • emitterName – Name of the emitter to output data to
  • imuSource – Name of the IMU source declared above (headset IMU)
  • settings: absoluteFromImuFrameQuat – Orientation of the tracked body frame relative to the IMU frame
  • settings: absoluteFromImuFrameVect – Translation of the tracked body frame relative to the IMU frame
  • settings: ignoreGravity – If true, accelerometer data of the headset IMU is not used to correct HMD pitch and roll orientation. This should be true for in-vehicle applications and false for stationary installations.
  • settings: opticalWeight – Impact of the optical orientation tracking on orientation measurements. Default = 0.005
  • settings: reference_imu – Name of the IMU to be used as the reference (fixed to vehicle) IMU
  • settings: referenceOrientationQuat – Orientation of the reference IMU body inside the optical tracking space
  • settings: referenceToOpticalQuat – Rotation translating from the reference IMU's internal coordinate system to the optical tracking coordinate system