FusionHub Manual

To make it easier to read, we’ve split the FusionHub documentation into two sub-manuals:
LPVR-AIR Manual (Client-Side Version)

LPVR-POS Manual

Please refer to these for the most up-to-date documentation.

 

Introduction

FusionHub is a software application that combines a number of sensor data inputs into a higher-level information output. There are three basic versions of FusionHub:

  • FusionHub BASE combines data from an outside-in tracking system with inertial measurements done by an inertial measurement unit (IMU). Typical applications: Head-mounted display tracking for VR/AR applications, camera tracking for virtual production

  • FusionHub MOVE adds an additional platform IMU to the BASE configuration. It combines data from both IMUs to calculate poses relative to a moving platform. Typical applications: AR/VR in a vehicle, aircraft, or on a simulator platform

  • FusionHub FLOW combines odometry, GPS and IMU data from a vehicle to calculate high-accuracy and low-latency global localization information. Typical applications: Automobile localization, robot localization

The diagram below shows the general structure of FusionHub. Sources and sinks are connected by a filter unit. The sensor fusion functionality is contained in this filter unit. The filter parameters as well as the parameters of input and output blocks can be configured via a configuration script or the graphical user interface.

The graphical user interface is detached from the main FusionHub application and both applications can therefore run on separate computers. This provides flexibility for running FusionHub on devices with limited monitoring capabilities like a head mounted display.

General

Starting FusionHub

FusionHub consists of two components:

  • The main application

  • A graphical user interface application

Insert the security dongle into a USB port of your computer.

The main FusionHub application is started by running FusionHub.exe. No specific installation is needed; the application can be run directly out of its deployment directory. It is a command line application that uses the file config.json for its configuration. We will explain the contents and options of the configuration file further below.

Please install the graphical user interface by running lp-fusionhub-dashboard_0.1.0_x64_en-US.msi. It adds lp-fusionhub-dashboard to your start menu; launch the application from there. Press the Connect button after starting FusionHub.exe to connect client and server. In case you are running FusionHub on a separate machine, make sure to enter the correct IP address.

The screenshot below shows the connection elements of the GUI.

Licensing

FusionHub has two options for license protection:

Hardware dongle

License authentication using a hardware dongle. This is especially useful for air-gapped installations that are not connected to the internet. As long as the dongle is inserted into a USB slot of the host system, FusionHub will run. Please note that for the Android (Quest 2 HMD) version of FusionHub, the GUI running on the streaming host is dongle protected; see more detailed information in the specific manual chapter.

Online license

License authentication using a software (online) license. This makes sense for systems that are connected to the internet at least during the initial installation of FusionHub. The software checks its license status with our license server in the following sequence:

  1. Enter the license key in the configuration file. You receive your personal license key from us.

  2. FusionHub sends the license key and machine code to the server.

  3. The server checks whether the license is valid and, if so, returns a response code.

  4. Copy the response code from the log and enter it in the config file as the ResponseKey parameter. Save the config file.

  5. This allows FusionHub to run on this specific machine without reconnecting to the internet. One license unit will be subtracted from your license account. Please ask us for assistance if you’d like to move your license.

If your default configuration file config.json doesn't contain it already, add the LicenseInfo block as shown below. Enter your personal key you received from us as LicenseKey.

{
    ...
    "LicenseInfo": {
        "LicenseKey": "EKKCO-GZYLT-NJKET-SASDC",
        "ResponseKey": ""
    }
    ...
}

BASE Filter Configuration

FusionHub BASE combines data from an outside-in tracking system with inertial measurements by an inertial measurement unit (IMU). The BASE filter integrates the angular velocity measured by the IMU’s gyroscope and corrects it by the pose of the optical target that is determined with the optical tracking system. This references the calculated pose to the coordinate system of the optical tracking system and avoids drift while maintaining the high frequency and responsiveness of the gyroscope data. The diagram below shows an overview of a BASE filter system.

The position output of the BASE filter is transferred directly from the optical measurements without modification. Pose prediction and interpolation of position measurements by accelerometer integration are under development.

Setup

  • Set up your optical tracking system. Attach the IMU to the optical target or attach both to the same rigid object, e.g. an HMD. Initialize the optical tracking body in your motion capture software and note the object ID.

  • Connect your IMU to the computer running FusionHub. Make sure your computer can connect to the IMU and read data by using LpmsControl 2. Make sure to disconnect from LpmsControl before running FusionHub.

  • Modify config.json to contain the correct information for your IMU and optical tracking system. See below how to configure the blocks in the configuration file. The configuration file can also be modified through the FusionHub GUI as shown further below.

Operation

If all components are connected and the configuration file is valid, FusionHub should work right away after starting the application. The console output shows a log of the initialization of the various components. Note that you can log the output from FusionHub to a file by adding

"record": {
    "filename": "log.a",
    "format": "json"
}

to the sink section of config.json.
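The sink section itself is a JSON object holding named sink nodes; a minimal sketch of a sinks section containing only such a record node (the surrounding key name follows the example configurations and should be treated as an assumption):

"sinks": {
    "record": {
        "filename": "log.a",
        "format": "json"
    }
}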

After starting and connecting the GUI the Auto Calibration section of Fusion Config should show increasing numbers for nImu (number of recorded IMU samples) and nOptical (number of recorded optical samples).

Calibration

There are two calibration steps that are required to operate the BASE filter:

Gyroscope Autocalibration

Gyroscope sensors have a built-in measurement bias that changes over time and is temperature-dependent. Good, permanent temperature calibration of MEMS gyroscopes is hard to achieve; therefore FusionHub offers the option to calibrate this offset at runtime. This calibration is semi-automatic.

The measurement bias of the gyroscope attached to the tracked object is calculated as an average of the data acquired over a certain time interval. For this sampling to happen, the object needs to be in a non-moving / static state. The state of the object is determined from the optical tracking input. So once the optical tracking system (e.g. ART DTrack) reports the optical target to be static, gyroscope data will be sampled and averaged, and a new bias compensation vector calculated.

The result of the autocalibration is saved in autocalibValue.json. When starting FusionHub for the first time, this offset is set to (0, 0, 0). Make sure to place the target, with the IMU attached, within the tracking volume and keep it static, e.g. by putting it on the floor.

IMU-Optical Intercalibration

The IMU-optical intercalibration calibrates the orientation difference between IMU and optical tracking body. When setting up a new system or after modifying the optical target a (re-)calibration is needed. The calibration is started by running FusionHub with the runIntercalibration option set to true.
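The switch lives in the settings of the fusion block (the complete block is documented in the IMU-Optical Fusion Filter section below); a minimal excerpt:

"fusion": {
    "type": "ImuOpticalFusion",
    "settings": {
        "runIntercalibration": true
    }
}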

Rotate the target with the IMU attached slowly within the tracking volume. You can monitor the status of the intercalibration in the Intercalibration section on the Fusion Config page of the GUI. After around 50 sampled poses the intercalibration should be finished and the GUI should show the resulting calibration quaternion.

Click Apply Intercalibration Result to automatically insert the result into the configuration file. Click Set and Save at the bottom of the editor to save the result and restart FusionHub.

Check the 3D View page to confirm that the intercalibration result is correct. The red and white cubes should overlap almost exactly at all times when you rotate your object inside the tracking volume. Note that after a restart it might take a few seconds for the optical and fused poses to converge.

IMU-Optical Fusion Filter

Example Configuration

Real-time IMU-optical fusion with LPMS-IG1 and ART Dtrack: imuOpticalFusion.json

Configuration Block

Node name: fusion

"fusion": {
    "type": "ImuOpticalFusion",
    "settings": {
        "echoFusedPose": false,
        "echoOpticalPose": true,
        "runIntercalibration": true,
        "Autocalibration": {
            "minAgeS": 60.0,
            "nSamplesForAutocalibration": 1500,
            "nSamplesForSteady": 256,
            "noiseRmsLimit": 0.02,
            "steadyThresholdAverage": 0.2,
            "steadyThresholdRms": 1.0
        },
        "MotionDetection": {
            "omegaLimit": 2.0,
            "positionSampleInterval": 1000,
            "rotationFilterAlpha": 0.9,
            "timeToUnknown": 500
        },
        "SensorFusion": {
            "alignment": { "w": 1.0, "x": 0.0, "y": 0.0, "z": 0.0 },
            "orientationWeight": 0.005,
            "tiltCorrection": null,
            "yawWeight": 0.01,
            "predictionInterval": 0.01,
            "sggPointsEachSide": 5,
            "sggPolynomialOrder": 5
        }
    }
}

Parameter name | Description | Default
type | Type of sensor fusion. At the moment only the default option is available. | ImuOpticalFusion
echoFusedPose | Print the fused pose as it is output | false
echoOpticalPose | Print the optical pose as it is received by the fusion | false
runIntercalibration | Starts the intercalibration between IMU and optical target | true
minAgeS | Minimum time between two autocalibrations | 60.0
nSamplesForAutocalibration | Number of samples used by the autocalibration | 1500
nSamplesForSteady | Number of samples needed below the threshold to trigger calibration | 256
noiseRmsLimit | Noise RMS limit | 0.02
steadyThresholdAverage | Threshold average limit | 0.2
steadyThresholdRms | Threshold RMS limit | 1.0
omegaLimit | Omega limit | 2.0
positionSampleInterval | Interval between two position samples for motion detection | 1000
rotationFilterAlpha | Weight for the rotation low-pass filter | 0.9
timeToUnknown | Interval until the autocalibration "unknown" state | 500
alignment | Alignment quaternion between IMU and optical target. Insert the result of the intercalibration here. | 1, 0, 0, 0
orientationWeight | Amount of correction of the angle calculated from gyroscope data by optical measurements (roll, pitch, yaw) | 0.005
tiltCorrection | Specify to correct the tilt of the angle calculated from gyroscope data by the vertical calculated from gravity measurements. This feature is not available yet. | null
yawWeight | Amount of yaw correction by optical data, if tilt correction is active | 0.01
predictionInterval | Time to look into the future for the calculation of the output quaternion | 0.01
sggPointsEachSide | Smoothing filter points on each side | 5
sggPolynomialOrder | Smoothing filter polynomial order | 5

This filter needs as input:

  • Optical tracking source

  • IMU source

This Filter outputs:

  • fusedPose

Output Data Format

Parameter name | Description | Unit
lastDataTime | Unused | s
orientation | Orientation quaternion | without unit
position | Unused | m
timestamp | Time of data acquisition | ns
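As an illustration, a single fusedPose record written to a JSON log might look roughly like this; the field names follow the table above, while the exact nesting and value layout are assumptions that may differ between FusionHub versions:

{
    "fusedPose": {
        "timestamp": 1672900000000000000,
        "lastDataTime": 0.0,
        "orientation": { "w": 1.0, "x": 0.0, "y": 0.0, "z": 0.0 },
        "position": { "x": 0.0, "y": 0.0, "z": 0.0 }
    }
}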

Source Options

Optical Tracking Source Options

Advanced Realtime Tracking (ART)

FusionHub works with all ART tracking systems, based on their DTrack tracking software.

Optitrack

FusionHub works with all Optitrack tracking systems based on their Motive tracking software.

VICON

FusionHub consumes VICON’s DataStream protocol. Communication has been tested with their Shogun software.

Antilatency

FusionHub connects directly to Antilatency’s USB or wireless trackers.

IMU Source

FusionHub supports all LP-RESEARCH IMUs.

See a description of how to prepare LPMS-IG1 for operation with FusionHub further below.

LPMS-IG1

LPMS-CURS3

Graphical User Interface

Dashboard

3D Viewer

Sensor Fusion Configuration and Calibration Status

General Settings

MOVE Filter Configuration

FusionHub MOVE adds an additional platform IMU to the BASE configuration. It combines data from both IMUs to calculate poses relative to a moving platform.

The MOVE filter section of FusionHub is still under development. Refer to LPVR-DUO for an implementation of the filter for specific virtual / augmented reality headsets.

FLOW Filter Configuration

FusionHub FLOW combines odometry, GPS and IMU data from a vehicle to calculate high-accuracy and low-latency global localization information. While GPS or RTK-GPS measurements alone provide similar positioning accuracy, the output frequency of these systems is relatively low, making them unsuitable for applications where localization information at higher framerates is required, such as positioning objects in an augmented reality environment.

By additionally using odometry (wheel speeds, steering angle etc.) information, the localization data from the GPS measurements is interpolated to achieve framerates limited only by IMU and odometry sampling speeds.

The FLOW filter has two operation modes with different configuration blocks in config.json and different output formats. The two modes are:

  • Low-dynamics filter (LD)

  • High-dynamics filter (HD)

The diagram below shows an overview of a simple FLOW filter setup.

Installation of Hardware Components

Inertial Measurement Unit (IMU)

LPMS-IG1P needs to be installed in the vehicle in a known orientation, ideally with the coordinate axes of the IMU parallel to the vehicle coordinate system. As the vehicle reference frame we use the VW coordinate system as shown in the image below. Connect the USB connector of LPMS-IG1P to the host computer. If needed, an active or passive USB extension can be used. Make sure to check data integrity with the LpmsControl 2 data acquisition tool; we have noticed communication issues with some passive USB extensions.

Global Positioning System (GPS)

The GPS receiver is integrated with the LPMS-IG1P sensor. Connect the antenna cable and place the GPS antenna on top of the vehicle.

Alternatively, a standalone RTK GPS module can be used as a GPS input source.

CAN Bus Connection

FusionHub can be connected to the vehicle CAN bus by using one of the following CAN bus interfaces: PeakCAN or Vector (see the canInterface parameter of the CAN bus source below).

Low-dynamics Filter (Odometry + GPS + (some) IMU)

Configuration Block

Node name: vehicularFusion
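A sketch of a possible configuration block, assembled from the parameters in the table below; the exact node/settings layout is an assumption, so compare it with the shipped example configuration gpsOdometryFusion.json before use:

"vehicularFusion": {
    "settings": {
        "echoFusedPose": false,
        "endpoint": 8801,
        "fitModel": "SimpleCarModel",
        "driveModel": "Differential",
        "velError": 0.277777778,
        "omegaError": 0.5,
        "measurementError": 0.1,
        "smoothFit": true,
        "useImuTurnRate": false,
        "imuTurnRateAxis": [1, 0, 0]
    }
}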

Parameter name | Description | Default
echoFusedPose | fusedVehiclePose output is printed to the command line | false
endpoint | Output port for the fusion result | 8801
fitModel | Model to use for the fusion. At the moment only SimpleCarModel is supported. | SimpleCarModel
driveModel | Model used to calculate the car trajectory from CAN bus data. If steering wheel data and a steering model are provided, the Ackermann model can be used. | Differential
velError | Velocity error for the Kalman filter. Keep the default value. | 0.277777778
omegaError | Omega error for the Kalman filter. Keep the default value. | 0.5
measurementError | Measurement error for the Kalman filter. Keep the default value. | 0.1
smoothFit | Enable this option to prevent the filter output from jumping between odometry data and GPS measurements. Keep enabled. | true
useImuTurnRate | If enabled, the IMU turn rate is used instead of the wheel-velocity-based turn rate. Recommended. | false
imuTurnRateAxis | The IMU axis to use for the turn rate if useImuTurnRate is enabled. | 1, 0, 0

This filter needs as input:

  • LPMS-IG1P data source for IMU and GPS data

Parameter name | Description | Default
type | Type of GPS receiver. Currently only DualRtk is allowed. | DualRTK
name | The name of the LPMS-IG1P sensor used in this setup. This parameter is optional. If FusionHub is operated at the same time as LPVR-DUO, we recommend specifying the sensor name. Look up the sensor name in LpmsControl 2. | n/a
autodetectType | Type of sensor to be autodetected | ig1p
rtcm | Set to true if RTCM input is to be received, e.g. from an NTRIP source. | false
imuEndpoint | Output endpoint for IMU data. This parameter is optional. | tcp://*:8802
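For orientation, a sketch of a corresponding LPMS-IG1P source entry assembled from the parameters above; the node name and nesting are assumptions, so check them against the shipped example configurations:

"ig1p": {
    "type": "DualRtk",
    "settings": {
        "autodetectType": "ig1p",
        "rtcm": false,
        "imuEndpoint": "tcp://*:8802"
    }
}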

  • Alternatively, for the case of separate IMU and RTK GPS sources (with an NTRIP caster for RTK correction):

RTCM Source

Parameter name | Description | Default
type | Type of RTCM correction data source. Currently only NTRIP is allowed. | NTRIP
host | NTRIP caster host. | 192.168.1.1
port | NTRIP caster port. | 2101
mountpoint | NTRIP mountpoint or stream to receive RTCM correction data. | 
user | NTRIP caster username. | 
password | NTRIP caster password. | 
userAgent | Name of the user agent when connecting to the NTRIP caster. | LPVR-POS
initialLatitude | Latitude to forward to the NTRIP caster on first connect. | 0.0
initialLongitude | Longitude to forward to the NTRIP caster on first connect. | 0.0
forwardGnss | Set to true if GNSS data from the GNSS source is to be forwarded to the NTRIP caster. This is useful if the NTRIP caster offers dynamic switching of RTCM correction data based on the forwarded location. | false

GNSS Source

Parameter name | Description | Default
type | Data output format of the GNSS data source. Currently only NMEA is allowed. | NMEA
port | Serial port number of the GNSS source. | 
baudrate | Serial port baud rate to connect to the GNSS source. | 
rtcm | Set to true to enable forwarding of RTCM correction data from the RTCM source to the GNSS module. | false

  • CAN bus and vehicle decoder source

Parameter name | Description | Default
type | Type of vehicle. Currently only Automotive is allowed. | Automotive
vehicleStateEndpoint | Endpoint for the vehicle state output | tcp://*:8999
canInterface | CAN interface used for reading odometry data. Allowed options: PeakCAN, Vector | PeakCAN
vehicleType | Type of vehicle. Currently supported vehicles have to be manually added. Contact us for details. | R56 (BMW Mini)

This Filter outputs:

  • fusedVehiclePose

Output Data Format

Parameter name | Description | Unit
acceleration | 3D acceleration vector as measured by the IMU. Describes the orientation of the vehicle in the vehicle coordinate system. | m/s^2
globalPosition | Longitude and latitude in degrees | degrees
lastDataTime | Unused | s
position | Position relative to the starting point with X pointing north and Y pointing east in the current UTM frame | m
timestamp | Timestamp of data acquisition | ns
utmZone | UTM zone | UTM string
yaw | Globally referenced yaw angle | rad

Additional Notes

The FusedVehiclePose contains a 3D acceleration vector. The acceleration is defined in the following manner: There's a configuration flag imuToCarRotation which takes a quaternion used to rotate vectors in the IMU frame to the car frame. By default it is the identity quaternion. For the LD model, the measured IMU acceleration is simply rotated by the imuToCarRotation and written to the output.

In the LD filter, pitch and roll have to be derived from the acceleration data based on a model of the stiffness of the chassis. This assumes a flat surface. The HD model offers the full 6-DOF pose, and we are planning to unify the two so that all data is available at all times.

As the filter relies heavily on GPS measurements, it doesn’t deliver good results indoors. The better the GPS reception, the better the resulting output of the filter. The yaw angle of the vehicle is calculated based on several GPS and odometry measurements while the car is moving. Therefore, after starting FusionHub while the car is static, the filter will not deliver a correct yaw angle. The angle will converge to the correct direction after a few seconds of driving the vehicle.

Example Configuration

Playback and fusion of prerecorded data: gpsImuFusionPlayback.json

Real-time fusion: gpsOdometryFusion.json

High-Dynamics Filter (IMU + GPS)

Node name: gnssImuFusion

Configuration block example (in sinks section)
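A sketch of a possible configuration block, assembled from the parameters in the table below; the layout is assumed to match the other filter blocks, so compare it with the shipped example configuration gpsImuFusionPlayback.json before use:

"gnssImuFusion": {
    "settings": {
        "echoFusedPose": false,
        "endpoint": 8803,
        "fitModel": "ModelGnssImu",
        "accelError": 0.01,
        "omegaError": 0.02,
        "measurementError": 0.05,
        "imuToCarRotation": { "w": 1.0, "x": 0.0, "y": 0.0, "z": 0.0 },
        "smoothFit": true,
        "singleEndpoint": true
    }
}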

Parameter name | Description | Default
echoFusedPose | fusedVehiclePose output is printed to the command line | false
endpoint | Output port for the fusion result (more than one endpoint can be used if needed; see the endpoint parameters below). | 8803
fitModel | Model to use for the fusion. | ModelGnssImu
accelError | Acceleration error for the Kalman filter. Keep the default value. | 0.01
omegaError | Omega error for the Kalman filter. Keep the default value. | 0.02
measurementError | Measurement error for the Kalman filter. Keep the default value. | 0.05
imuToCarRotation | Orientation quaternion of the IMU relative to the car frame | 1, 0, 0, 0
smoothFit | Enable this option to prevent the filter output from jumping between IMU data and GPS measurements. Keep enabled. | true
singleEndpoint | If enabled, the different fusion output messages are published to the same port ("endpoint"). If disabled, the "endpoint" parameter above is the output port for the FusedVehiclePose message type only. | true
poseEndpoint | Output port for the FusedPose message type. | 8804 if singleEndpoint=false
globalPoseEndpoint | Output port for the GlobalFusedPose message type. | 8805 if singleEndpoint=false
outputRawGnssData | Publishes the raw GNSS position instead of the fusion output. Useful for debugging. | false
outputWhenFilterNotReady | Publishes a temporary raw GNSS data output while the filter is initializing. Useful for a minimal check before moving the vehicle. | false

Setting up the ImuToCarRotation parameter


The car frame used is the VW coordinate frame (see the image in the Inertial Measurement Unit section above).

The IMU sensor can be mounted in any orientation, but the imuToCarRotation quaternion needs to be provided to transform the IMU data into the VW frame.

Example


Suppose the IMU is mounted such that, to match the VW frame, a 180° rotation around the z axis (clockwise) is needed. The corresponding rotation matrix is

R = | -1  0  0 |
    |  0 -1  0 |
    |  0  0  1 |

and the orientation quaternion is [x, y, z, w] = [0, 0, 1, 0], which can be specified in the configuration as shown below.
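A sketch of the corresponding configuration entry; the quaternion key layout follows the alignment quaternion convention used in the BASE filter block, and the surrounding nesting is an assumption:

"gnssImuFusion": {
    "settings": {
        "imuToCarRotation": { "w": 0.0, "x": 0.0, "y": 0.0, "z": 1.0 }
    }
}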

 

This filter needs as input:

  • LPMS-IG1P data source for IMU and GPS data

  • Alternatively, for the case of separate IMU and RTK GPS sources (with an NTRIP caster for RTK correction):

The RTCM source and GNSS source configuration blocks are identical to those described for the low-dynamics filter above.

  • CAN bus and vehicle decoder source

This Filter outputs:

  • fusedVehiclePose (2D pose): Output equivalent to the LD filter output. Includes position in meters relative to starting point, global position (lon, lat) and heading.

  • fusedPose (3D pose): relative to starting point, x, y (in meters) + z (height) + 3D orientation quaternion

  • globalFusedPose: globally referenced 3D position (longitude, latitude, height) + 3D orientation quaternion in ENU frame

Output data format

FusedVehiclePose

Parameter name | Description | Unit
acceleration | 3D acceleration vector as measured by the IMU. Describes the orientation of the vehicle. | m/s^2
globalPosition | Longitude and latitude in degrees | degrees
lastDataTime | Unused | s
position | Position within the UTM zone | m
timestamp | Timestamp of data acquisition | ns
utmZone | UTM zone | UTM string
yaw | Globally referenced yaw angle | rad

FusedPose

Parameter name | Description | Unit
lastDataTime | Unused | s
orientation | Orientation quaternion in the ENU coordinate frame | without unit
position | X, Y position + height | m
timestamp | Time of data acquisition | ns

GlobalFusedPose

Parameter name | Description | Unit
orientation | Orientation quaternion | without unit
position | Longitude, latitude, height | deg, deg, m
timestamp | Time of data acquisition | ns

Information regarding the ENU coordinate system is here: Local tangent plane coordinates

Example Configuration

Playback and fusion of prerecorded data: gpsImuFusionPlayback.json

Data Playback [DEPRECATED, switch over to ReplayExecutable]

Data from a log file can be played back and forwarded to a fusion filter using the fileReader block, as sketched below.
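This is a minimal sketch using the two parameters from the table that follows; the exact nesting and the example values are assumptions:

"fileReader": {
    "settings": {
        "filename": "log.json",
        "playbackInterval": 0.01
    }
}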

Parameter name | Description | Unit
filename | Name of the file to be played back | n/a
playbackInterval | Time interval between each line of the playback file | s

Replay Node [DEPRECATED, switch over to ReplayExecutable]

Replays data from a disk file.

Key | Description | Type | Example value
filepath | Path of the file to read | String | "log.json"
replaySpeed | Playback speed relative to the actual recording | Double | 1
readMultipleLines | Number of lines to read each time | Integer | 10

Replay Executable

This is a separate executable that can be built from the FusionHub project. In the Visual Studio build target dropdown there is an option to build ReplayExecutable.exe.

The replay executable reads data from a file, pushes it to a replay queue and sends it to the network (tcp://localhost:9921 by default). To run the ReplayExecutable, the following command line options are available (a typical invocation is shown after the table):

Key | Description | Type | Example value
-r | Path of the file to read | String | "log.json"
--replay-speed | Playback speed relative to the actual recording | Double | 1
--queue-size | Size of the queue at which the file reader stops pushing new data to the replay queue. Increase this value when you see a lot of data being published at the same time when running with --verbose. | Integer | 100
--echo-data | Listen to the publishing endpoint and display the replayed data | N/A | N/A
--verbose | Print debugging information, i.e. the timestamps at which a packet is added to the replay queue, replayed from the replay queue, and discarded from the replay queue. | N/A | N/A
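A typical invocation combining the options above (the file name is just an example):

ReplayExecutable.exe -r log.json --replay-speed 1 --echo-data --verbose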

A normal FusionHub program can then receive the file data by having an endpoints source defined in the configuration file:
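A hypothetical sketch of such a source; the key names are assumptions and should be verified against your default config.json:

"sources": {
    "endpoints": [ "tcp://localhost:9921" ]
}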

 

Graphical User Interface

Map View

Data Playback and Recording

Data playback and recording works in the same way for all FusionHub versions. It has been described in the previous chapters, but we recap it here in a dedicated chapter, as it is an important feature for data analysis and serialization.

Data Recording

Record node

You can record the output from FusionHub to a file by adding
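"record": {
    "filename": "log.a",
    "format": "json"
}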

to the sink section of config.json.

File Logger

Data Playback

Data from a log file can be played back and forwarded to a fusion filter using the fileReader block. The parameters of this node are listed below:

Parameter name | Description | Unit
filename | Name of the file to be played back | n/a
playbackInterval | Time interval between each line of the playback file | s

Communication with External Applications

WebSocket APIs

Apart from manually editing the config.json configuration script or modifying it through the GUI, FusionHub also offers a WebSocket API for external applications to change its configuration. In fact, the GUI uses this interface to access FusionHub’s settings.

Note that the WebSocket communication is currently not encrypted and therefore not secure. Please take your own precautions to make sure the network traffic for the configuration isn’t intercepted in some way. We might add an option for secure communication in future releases.

The WebSocket server can be accessed via port 19358 on the machine hosting the FusionHub service. To accelerate development, download the Simple WebSocket Client Chrome plugin. This allows you to manually enter API commands and check the replies from the server.

Command | Description
getConfig | Gets the in-memory configuration.
getSavedConfig | Gets the on-disk configuration.
saveConfig | Saves the in-memory configuration to disk.
setConfig | Updates the in-memory configuration. This API creates new key-value pairs or updates existing values. It does not save the configuration to disk. Note that in "data" you only need to specify the path to the JSON key to update; for example, a request that only sets a port value to 5005 leaves everything else unchanged.
setConfigJsonPath (available for JVC branch) | Updates the in-memory configuration given a JSON path and a new value. For more information about JSON paths, please refer to JSON Pointer - JSON for Modern C++ (nlohmann.me).
overwriteConfig (disabled in JVC branch) | Overwrites the in-memory configuration. This is suitable when the user wants to remove a key from the configuration.
getIntercalibrationStatus | Gets the current intercalibration status. Useful for refetching the current status when the frontend accidentally disconnects.
applyIntercalibrationResults | Applies the current intercalibration quaternion to the in-memory copy of the configuration. This does not save to disk.
restartBackend | Restarts the backend. Internally, the main while loop resets the DataBlock, causing all sources and sinks to be freed from memory and instantiated again.
startRecording | Listens to data published to the endpoints and writes it to a file named YYYYMMDD-HHMMSS-{comment}.{format}
stopRecording | Stops the current recording.
listRecording | Lists the recorded filenames since FusionHub booted up.
getVersion | 
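A minimal Python sketch for talking to the WebSocket API, using the websocket-client package. The exact request and response payload formats are not documented in this manual, so the request string below is only a placeholder; use the GUI (which talks to this same interface) or the command list above as a reference:

import websocket  # pip install websocket-client

# FusionHub's WebSocket API listens on port 19358 of the machine running FusionHub
ws = websocket.create_connection("ws://localhost:19358")

# Placeholder request: replace with the actual payload expected by your FusionHub version,
# e.g. a getConfig request as sent by the lp-fusionhub-dashboard GUI.
request = "getConfig"
ws.send(request)

# Print whatever the server replies (for getConfig, the in-memory configuration)
print(ws.recv())
ws.close()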

Sending FusionHub Data to External Applications via the ZeroMQ Interface

FusionHub emits data resulting from the sensor fusion through the local network interface.

Output Ports

The network port that this information is output to can be configured in the JSON parameter file config.json of FusionHub.

Data Format

As the low-level protocol to emit the output data we use ZeroMQ (publisher / subscriber). The data itself is encoded as Protocol Buffers. Protocol Buffers are documented here. Messages are defined in the Protobuf (.proto) format in the file stream_data.proto. This file is contained in the installation folder of FusionHub.

Python Resources

Download a Python example that shows how to decode messages from FusionHub from this repository.

Prerequisites can be installed in your Python 3 environment with this:
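The exact requirements ship with the example repository; assuming the example uses pyzmq and protobuf, a typical installation is:

pip install pyzmq protobuf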

Make sure to set the input port in FusionHubPythonExample.py correctly. For example, for an Antilatency source definition, the port needs to be set to 8899.
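A minimal subscriber sketch; it assumes the output port 8899 mentioned above and only prints the raw payload size, since decoding additionally requires the Python module generated from stream_data.proto with protoc (not shown here):

import zmq

context = zmq.Context()
socket = context.socket(zmq.SUB)
socket.connect("tcp://localhost:8899")   # adjust host/port to your FusionHub output endpoint
socket.setsockopt(zmq.SUBSCRIBE, b"")    # subscribe to all messages

while True:
    message = socket.recv()                   # raw Protocol Buffers payload
    print(f"received {len(message)} bytes")   # decode with the classes generated from stream_data.proto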

C# Resources

On parsing Protobuf files: https://github.com/5argon/protobuf-unity

How to subscribe to ZeroMQ messages: https://github.com/gench23/unity-zeromq-client and ZeroMQ in Unity - UQIDO TECH

VRPN Output

VRPN output is set in the sinks section of config.json, as shown below. The device name will be referenced by the plugin for Unreal Engine.
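A hypothetical sketch only; the key names are assumptions and should be checked against the sinks section of your default config.json. The device name is what the Unreal Engine plugin references:

"vrpn": {
    "deviceName": "FusionHub"
}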

Please see below how we achieve data input via VRPN in Unreal Engine. First, install the VRPN LiveLink plugin:

Configure the VRPN source with the correct device and subject name:

Apply the output from FusionHub to an Unreal object, e.g. a Cine Camera Actor.

Hardware Preparation

Inertial Measurement Units

General documentation for LPMS IMUs is here.

Switching LPMS-IG1(P) to USBxpress Mode

Note: These instructions work for LPMS-IG1 (IMU only) and LPMS-IG1P (IMU + GPS).

First, download LpmsControl 2 from here and install it.

Connect LPMS-IG1(P) to your computer and start LpmsControl 2.

In LpmsControl 2 select one of the LPMS-IG1(P) sensors and connect to it.

In case the sensor is in VCP (virtual COM port) mode as shown below, click on Convert to switch the sensor to USBxpress mode. This is required for communication with FusionHub.

After converting the sensor to USBxpress mode it should be displayed as such.

The image below shows typical output from LPMS-IG1(P) after connecting.

Close LpmsControl 2 to disconnect from the sensor. You are now ready to use LPMS-IG1(P) in FusionHub.

Optical Tracking Systems

Coming soon.

FusionHub on OpenXR HMDs

General

To make the tracking functionality of FusionHub available to standalone augmented and virtual reality headsets, it can be integrated with Android-compatible OpenXR HMDs. This works via a customized version of the ALVR / ALXR open source projects. ALVR allows streaming image data wirelessly from a host computer and interfaces to 3D content engines through SteamVR. While the original ALVR client was built to work on Meta Quest HMDs, the OpenXR version of ALVR called ALXR works in principle on any OpenXR compatible headset.

We use a thin client library to receive IMU data from the HMD API, pass it to FusionHub, process it there and then re-inject the information into the video pipeline of the headset. Depending on the type of HMD this happens within the ALVR client’s standard interface or in a separate hardware-specific API layer.

The overall system consists of several applications running at the same time. As the development of this application is still work-in-progress, starting and configuring this solution can be a bit cumbersome. We are working on making the process easier as we move along.

See the illustration below for a block diagram of the overall system:

System Components

Applications

The following applications need to be started on the head-mounted display and the host computer. They should all be included in the installation package that you received from us. We will discuss the order of starting these applications and what their status output should be below.

On the headset:

Application | Purpose | Name
FusionHub server | Receives IMU data from the ALVR client; receives optical tracking data; sends the sensor fusion result to the ALVR client | FusionHub-v1.2-Launcher.apk
ALVR client | Receives image data from the ALVR server; connects to the FusionHub server; sends pose information to the ALVR server | FusionHub-v1.2-ALVR-Client.apk

On the host computer:

Application | Purpose | Name
FusionHub GUI | Connects to the FusionHub server on the HMD; configures FusionHub; authenticates the system | lp-fusionhub-dashboard.exe
ALVR server | Receives pose information from the client; sends pose information to SteamVR; sends image data to the ALVR client; receives image information from SteamVR | ALVR Launcher.exe

Authentication

FusionHub authenticates itself via the GUI client application. In order to run the client application, make sure to insert the LPVR USB dongle into the host computer. After the GUI client is connected to FusionHub on the HMD, FusionHub will start streaming pose data to ALVR.

Running the Solution

Installation

Install the FusionHub APK and ALVR client APK on the headset using a side-loading tool like Sidequest. In case of a Meta Quest HMD this will require you to put the HMD into developer mode.

The FusionHub GUI client and ALVR server can be started on the host PC without further installation, they can be run right out of the deployment folder.

Meta Quest 2 / Pro

Install the SideQuest client (Advanced Installer), which allows you to sideload APK files to your HMD. The headset needs to be in developer mode. Follow the instructions the SideQuest client shows you or refer to this page.

VIVE Focus 3

Coming soon.

nReal Glasses

Coming soon.

Start-up

FusionHub

  • Start FusionHub on the HMD. A window showing the FusionHub console output should open.

  • Start the FusionHub GUI client on the host computer

  • Connect the GUI client to FusionHub on the HMD. Make sure HMD and host are in the same subnet. Enter the correct IP of the HMD in the client before pressing Connect.

  • Adjust parameter blocks as needed. Refer to the description of FusionHub BASE for configuration options. Note the following input and output ports that are hard-coded in the ALVR FusionHub API layer. These are already correctly set in the default configuration file installed with the FusionHub APK, so usually there is no need to change them.

Endpoint | Direction | Purpose
tcp://*:8799 | Output | Fused pose data
tcp://localhost:8898 | Input | IMU data

  • If it’s not running yet make sure to start and configure your optical tracking system. Once optical data is streamed to FusionHub, the nOptical counter in the GUI should be increasing.

ALVR

  • Start the ALVR server on the host. While the ALVR server starts up, it will automatically run SteamVR.

  • Start the ALVR client on the HMD. The HMD should be shown in the list of ALVR clients in the ALVR server application. In some cases you need to click the Trust button in the application to start streaming.

  • Once streaming starts, you should see the SteamVR default environment through the headset. Check if the nIMU counter in the FusionHub GUI is increasing. If both nOptical and nIMU are increasing, the communication between ALVR, optical tracking and FusionHub is working.

  • You can now use and calibrate the system as described in the FusionHub BASE section.

Optical Tracking Systems

Marker Adjustment

The optical system is the tracking reference; its pose is what is received by the visualization backend. The orientation of the IMU sensor is calibrated relative to the optical markers on the HMD. Therefore it is important to set up the tracking body or rigid body in the optical tracking software (DTrack, Motive etc.) in such a way that its axes align with the optical axes of the head-mounted display.

We will add to this section soon. In the meantime refer to this page for ART setups and this page for OptiTrack setups from the LPVR documentation.

Optitrack Notes

In order to avoid excessive buffering and data loss, make sure to reduce the amount of data being streamed from Motive. We recommend the streaming settings below. Make sure to set the local interface IP to the IP of the network connection that is being used for communicating with the HMD. It is a common mistake to not set this IP correctly. If the correct IP doesn’t show up, restart Motive.

Troubleshooting

If something doesn’t work as expected, the best way to diagnose the problem is to look at the log output of FusionHubLauncher. The log can be recorded by connecting your HMD to a PC using a USB (most likely USB-C) cable.

For this purpose download the Android platform tools for Windows from here: SDK Platform Tools release notes  |  Android Studio  |  Android Developers

Copy the platform tools files to a folder and open a command prompt in that folder.

Check if the HMD is detected by your computer by entering adb devices. If your device is not detected or marked as unauthorized, make sure you have the correct USB driver for your HMD installed and that you have acknowledged USB access from the host computer in the Android GUI on the HMD.

Enter adb logcat | findstr fusionhub to stream log data from the device to your command line. Only data from the FusionHubLauncher application will be displayed. The initialization log will be displayed when FusionHubLauncher is first started and every time you restart it using the Restart button in the GUI.

Release Notes

Version 1.2

Release date: 2023/1/5

  • GUI as standalone application

  • Support for LPMS-CURS3 and other series 3 sensors as input source (BASE and MOVE)

  • Added GPS-IMU filter (FLOW)

  • More example configurations

  • Added sample data for vehicle localization

  • Various bug fixes and smaller modifications

Version 1.1

Release date: 2022/11/21

  • New graphical user interface

  • Added GPS-odometry fusion for automobile localization

Version 1.0

Release date: 2022/8/25

  • First full release

  • Added IMU-optical fusion