
The JSON configuration file defines the input/output and processing components within LPVR. More specifically, there is one overall header tag and four sub-module types:

  • PoseMachineConfig - Header tag that precedes the definition of all sub-modules
  • absoluteSources - Defines the absolute positioning (optical tracking) devices used to acquire global poses
  • imuSources - Defines sources for IMU orientation data
  • trackedObjects - Defines how objects are tracked, i.e. their data sources and how the different data sources are combined
  • emitters - Defines the output target of the sensor fusion done for trackedObjects. This is usually an HMD.

The configuration file consists of these components to define the flow of input signals to output signals. The input to a typical system would consist of an absoluteSources structure for an optical tracking input, one or more imuSources and one or more emitters to output the information. All input data is combined in the trackedObjects structure that defines the parameters of the sensor fusion and signal routing. The figure below shows an overview of the structure of a configuration file that works for differential IMU headset tracking.

Info

The outer-most structure "PoseMachineConfig" : { } should only be added when editing settings.json directly. When editing the configuration in the integrated editor on the configuration page, it needs to be omitted.
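As a rough sketch of this layout when editing settings.json directly (the names my_vicon, my_imu and reference_imu are placeholders taken from the examples below, and the ellipses stand for the contents described in the following sections):

Code Block
"PoseMachineConfig": {
  "absoluteSources": {
    "my_vicon": { ... }
  },
  "imuSources": {
    "my_imu": { ... },
    "reference_imu": { ... }
  },
  "trackedObjects": [
    { ... }
  ],
  "emitters": [
    { ... }
  ]
}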

...

Optical Tracking Sources

The absoluteSources tag describes an absolute position and orientation source like an optical tracking system. Currently LPVR supports VICON, Optitrack and ART tracking systems natively, as well as the common VR communication protocol VRPN. "absoluteSources" is an object whose properties describe the optical tracking systems in use. The name of each property is used to reference the tracking system in the remainder of the configuration. Examples for various tracking systems are given below.

ART

Code Block
"absoluteSources": {
  "my_dtrack": {
    "settings": {
      "axisPermutation": "xyz",
      "host": "192.168.1.38",
      "port": 5000
    },
    "type": "DTrack"
  }
}

  • my_dtrack (property name) - Defines the name of the source. Any name is good.
  • type - Must be DTrack
  • settings: axisPermutation - Optional axis permutation setting to adjust the coordinate system
  • settings: host - Address of the host PC
  • settings: port - Port number of the host PC

VICON

Code Block
"absoluteSources": {
  "my_vicon": {
    "settings": {
      "host": "192.168.1.6:801",
      "trackedObjects": [
        {
          "subject": "HMD",
          "segment": "root"
        }
      ]
    },
    "type": "Vicon"
  }
}

  • my_vicon (property name) - Defines the name of the source. Any name is good. Will be referenced in trackedObjects further down.
  • settings: host - IP address of the VICON host computer running VICON Tracker, Blade etc.
  • settings: trackedObjects - An array of JSON objects. Each contains "subject" (required) and "segment" (defaults to "root") and is used to identify the VICON objects used in further processing. The remainder of the configuration can reference them by their numerical index (starting from zero) in this array (see the sketch after this list).
  • type - Must be Vicon
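For illustration, a sketch of a VICON source with two tracked subjects; the subject name "Controller" is hypothetical. The first entry is addressed as trackingId 0 and the second as trackingId 1 in the trackedObjects section further down. The second entry omits "segment", so it defaults to "root".

Code Block
"absoluteSources": {
  "my_vicon": {
    "settings": {
      "host": "192.168.1.6:801",
      "trackedObjects": [
        { "subject": "HMD", "segment": "root" },
        { "subject": "Controller" }
      ]
    },
    "type": "Vicon"
  }
}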

Optitrack

Code Block
"absoluteSources": {
  "my_optitrack": {
    "settings": {
      "connectionType": "Multicast",
      "localAddress": "127.0.0.1",
      "remoteAddress": "127.0.0.1",
      "serverCommandPort": 1510,
      "serverDataPort": 1511
    },
    "type": "OptiTrack"
  }
}

  • my_optitrack (property name) - Defines the name of the source. Any name is good. Will be referenced in trackedObjects further down.
  • type - Must be OptiTrack
  • settings: connectionType - Can be Multicast or Unicast
  • settings: localAddress - Local address of the Optitrack client
  • settings: remoteAddress - Address of the Optitrack server
  • settings: serverCommandPort - Must be 1510
  • settings: serverDataPort - Must be 1511

VRPN

Code Block
"absoluteSources": {
  "my_vrpn": {
    "settings": {
      "tracker": "DTrack@127.0.0.1"
    },
    "type": "VRPN"
  }
}

  • my_vrpn (property name) - Defines the name of the source. Any name is good. Will be referenced in trackedObjects further down.
  • settings: tracker - Name and address of the VRPN server
  • type - Must be VRPN

IMU Sources

The imuSources tag describes the IMUs in use. It is a JSON object whose properties define the individual IMUs.

Code Block
"imuSources": {
  "my_imu": {
    "type": "ViveHeadset"
  },
  "reference_imu": {
    "settings": {
      "name": "lpmscu2000327",
      "autodetectType": "ig1"
    },
    "type": "OpenZen"
  },
  "no_imu": {
    "type": "None"
  }
}

Each member of the "imuSources" object has an identifier which defines the name of the source. Any name is good. It will be referenced in trackedObjects further down. The individual objects contain a required "type" field and an optional "settings" field.

  • type - Can be one of the following:
      • OpenZen to use the OpenZen library
      • ViveHeadset to use the internal Vive IMU
      • OpenVR for the IMUs in SteamVR controllers
      • Varjo for the Varjo headset IMU
      • LpSensor to use the LpSensor library (deprecated)
      • None for a dummy IMU that doesn't emit any data
  • settings: name - Specifies the ID of the connected IMU (not required for ViveHeadset).
  • settings: autodetectType - For "type": "OpenZen" the driver will use autodetection if "name" is omitted. Usually the default "ig1" is appropriate, but in some experimental environments "lpms" might be the correct type (see the sketch below).

Output of Pose Calculation

The orientation and position calculated by sensor fusion of IMU and optical tracking is output to a headset by an emitter. The emitter allows setting parameters that determine the position and orientation offset between the IMU coordinate system and the optical coordinate system of the headset.

Code Block
"emitters": [
  {
    "name": "HMD",
    "settings": {
      "imuFromEyeQuat": {
        "w": 1,
        "x": 0,
        "y": 0,
        "z": 0
      },
      "imuFromEyeVect": {
        "x": 0,
        "y": 0,
        "z": 0
      },
      "type": "HMD"
    },
    "type": "OpenVR"
  },
  {
    "name": "console",
    "settings": {
      "interval": 10
    },
    "type": "Console"
  }
]

  • name - Defines the name of the output device. Any name is good. Will be referenced in the trackedObjects tag.

HMD Emitter - Emits orientation to the HMD

  • settings: imuFromEyeQuat - Rotation from the eye frame, in which the graphics are rendered, to the IMU frame. To operate in the optical coordinate system, imuFromEyeQuat must be the inverse of absoluteFromImuFrameQuat (see the worked example after this list). Note: imuToEyeQuat is not a valid parameter, please don't use it.
  • settings: imuFromEyeVect - Translation from the eye frame, in which the graphics are rendered, to the IMU frame. Note: imuToEyeVect is not a valid parameter, please don't use it.
  • settings: type - Must be HMD
  • type - Must be OpenVR

Console Emitter - Displays orientation as SteamVR console output

  • settings: interval - Interval between log outputs in ms
  • type - Must be Console
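As a worked illustration of the inverse relationship mentioned above (the values are hypothetical): for a unit quaternion the inverse is the conjugate, i.e. the same rotation with the signs of x, y and z flipped. If absoluteFromImuFrameQuat in the trackedObjects settings describes a 90° rotation about the z axis, the matching imuFromEyeQuat in the HMD emitter settings would be:

Code Block
"absoluteFromImuFrameQuat": { "w": 0.7071, "x": 0, "y": 0, "z": 0.7071 }

"imuFromEyeQuat": { "w": 0.7071, "x": 0, "y": 0, "z": -0.7071 }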

Pose Calculation 

The actual pose of an object is calculated by combining IMU data and optical tracking information. This tag combines the modules we defined above. The resulting pose is forwarded to the emitter block.

Code Block
"trackedObjects": [
  {
    "absoluteSource": {
      "name": "my_vicon",
      "trackingId": 0
    },
    "combinerType": "DifferentialImu",
    "emitterName": "HMD",
    "imuSource": "my_imu",
    "settings": {
      "absoluteFromImuFrameQuat": {
        "w": 1,
        "x": 0,
        "y": 0,
        "z": 0
      },
      "absoluteFromImuFrameVect": {
        "x": 0,
        "y": 0,
        "z": 0
      },
      "ignoreGravity": true,
      "opticalWeight": 0.005,
      "referenceImu": "reference_imu",
      "referenceOrientationQuat": {
        "w": 1,
        "x": 0,
        "y": 0,
        "z": 0
      },
      "referenceToOpticalQuat": {
        "w": 1,
        "x": 0,
        "y": 0,
        "z": 0
      }
    }
  }
]

  • absoluteSource: name - Name of the previously defined absolute source (VICON etc.)
  • absoluteSource: trackingId - ID of the object tracked by the optical system
  • combinerType - Type of sensor fusion used:
      • Default: uses single-IMU fusion (see the sketch after this list)
      • DifferentialImu: uses differential dual-IMU operation (LPVR-DUO)
  • emitterName - Name of the emitter to output data to
  • imuSource - Name of the IMU source declared above (headset IMU)
  • settings: absoluteFromImuFrameQuat - Orientation of the tracked body frame relative to the IMU frame. Make sure this parameter is spelt correctly in your configuration; there has been some confusion about the exact naming convention, also internally at LP.
  • settings: absoluteFromImuFrameVect - Translation of the tracked body frame relative to the IMU frame
  • settings: ignoreGravity - If true, accelerometer data of the headset IMU is not used to correct HMD pitch and roll orientation. This should be true for in-vehicle applications and false for stationary installations.
  • settings: opticalWeight - Impact of the optical orientation tracking on orientation measurements. Default = 0.005
  • settings: referenceImu - Name of the IMU to be used as the reference (vehicle-fixed) IMU
  • settings: referenceOrientationQuat - Orientation of the reference IMU body inside the optical tracking space
  • settings: referenceToOpticalQuat - Rotation from the internal coordinate system of the reference IMU to the optical tracking coordinate system
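For comparison with the DifferentialImu example above, a minimal single-IMU entry with combinerType Default might look like the sketch below. That the reference-IMU related settings (referenceImu, referenceOrientationQuat, referenceToOpticalQuat) can simply be omitted in this mode is an assumption, not something stated on this page.

Code Block
"trackedObjects": [
  {
    "absoluteSource": {
      "name": "my_vicon",
      "trackingId": 0
    },
    "combinerType": "Default",
    "emitterName": "HMD",
    "imuSource": "my_imu",
    "settings": {
      "absoluteFromImuFrameQuat": { "w": 1, "x": 0, "y": 0, "z": 0 },
      "absoluteFromImuFrameVect": { "x": 0, "y": 0, "z": 0 },
      "ignoreGravity": false,
      "opticalWeight": 0.005
    }
  }
]

Here ignoreGravity is set to false, as recommended above for stationary installations.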

LPVIZ: Procedure for Manual Adjustment of HMD Marker Target

  • Adjust the optical target first: set imuSource to no_imu (you need to create a no_imu source of type None first, see the sketch after this list) and adjust the target live in DTrack (ART) or Motive (Optitrack). Make sure the coordinate system center in the SteamVR grid is exactly in the center between the two SmartTrack cameras, or wherever you set the center of your coordinate system.

  • Then, switch the HMD IMU back on (set imuSource to the headset IMU) and calibrate optical vs. IMU.

  • To operate in the optical coordinate system, imuFromEyeQuat must be the INVERSE of absoluteFromImuFrameQuat.

  • Check a simple Unreal or Unity test scene with a single object at (0, 0, 0) and player start at (0, 0, 0), rot = (0, 0, 180). You should observe very little distortion and shifting when looking around the object.

  • Only after this has been confirmed, test in the moving vehicle.
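As a sketch of the relevant fragments for the first step (the source name my_dtrack and emitter name HMD are taken from the examples above; your names may differ, and the remaining fields of the trackedObjects entry stay as in your existing configuration):

Code Block
"imuSources": {
  "no_imu": {
    "type": "None"
  }
},
"trackedObjects": [
  {
    "absoluteSource": {
      "name": "my_dtrack",
      "trackingId": 0
    },
    "emitterName": "HMD",
    "imuSource": "no_imu"
  }
]

Once the target is adjusted in DTrack or Motive, switch imuSource back to the headset IMU as described in the second step.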