
Introduction

FusionHub is a software application that combines various sensor data inputs into a higher-level data output. There are three basic versions of FusionHub:

  • FusionHub BASE combines data from an outside-in tracking system with inertial measurements via an IMU. Typical applications: Head-mounted display tracking for VR/AR applications, camera tracking for virtual production

  • FusionHub MOVE adds an additional platform IMU to the BASE configuration. It combines data from both IMUs to calculate poses relative to a moving platform. Typical applications: AR/VR in a vehicle, aircraft, or on a simulator platform

  • FusionHub FLOW combines odometry, GPS and IMU data from a vehicle to calculate high-accuracy and low-latency global localization information. Typical applications: Automobile localization, robot localization

The diagram below shows the general structure of FusionHub. Sources and sinks are connected by a filter unit. The sensor fusion functionality is contained in this filter unit. The filter parameters as well as the parameters of input and output blocks can be configured via a configuration script or the graphical user interface.

The graphical user interface is detached from the main FusionHub application, so the two can run on separate computers. This provides flexibility for running FusionHub on devices with limited monitoring capabilities, such as a head-mounted display.

General

Starting FusionHub

FusionHub consists of two components:

  • The main application

  • A graphical user interface application

Insert the security dongle into a USB port of your computer.

The main FusionHub application is started by running FusionHub.exe. No specific installation is needed; the application can be run directly out of its deployment directory. It is a command line application that uses the file config.json for its configuration. We explain the contents and options of the configuration file further below.

Please install the graphical user interface by running lp-fusionhub-dashboard_0.1.0_x64_en-US.msi. It installs lp-fusionhub-dashboard in your start menu; launch the application from there. Press the Connect button after starting FusionHub.exe to connect client and server. In case you are running FusionHub on a separate machine, make sure to enter the correct IP address.

The screenshot below shows the connection elements of the GUI.

BASE Filter Configuration

FusionHub BASE combines data from an outside-in tracking system with inertial measurements via an IMU.

Setup

  • Set up your optical tracking system. Attach the IMU to the optical target, or attach both to the same rigid object, e.g. an HMD. Initialize the optical tracking body in your motion capture software and note the object ID.

  • Connect your IMU to the computer running FusionHub. Make sure your computer can connect to the IMU and read data by using LpmsControl 2. Make sure to disconnect from LpmsControl before running FusionHub.

  • Modify config.json to contain the correct information for your IMU and optical tracking system. See below for how to configure the various blocks of the configuration file. The configuration file can also be modified through the FusionHub GUI, as shown further below.
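For orientation, the sketch below shows how these blocks can fit together in one config.json. The source and sink block contents follow the examples later in this documentation; the exact top-level layout (e.g. a "sources" wrapper) is an assumption here, so treat the example configurations shipped with FusionHub as the authoritative template.

{
    // Assumed top-level layout; see the shipped example configurations
    "sources": {
        "optical": {
            "type": "DTrack",
            "settings": { "port": 5005, "bodyID": 3 }
        },
        "imu": {
            "type": "OpenZen",
            "settings": { "autodetectType": "ig1" }
        }
    },
    "fusion": {
        "type": "ImuOpticalFusion"
        // settings as described under "BASE Filter Configuration"
    },
    "sinks": {
        "record": { "filename": "log.a", "format": "json" }
    }
}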

Operation

If all components are connected and the configuration file is correct, FusionHub should work right away after starting the application. The console output shows a log of the initialization of the various components. Note that you can always log the output from FusionHub to a file by adding

"record": {
    "filename": "log.a",
    "format": "json"
}

to the sink section of config.json.
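To inspect such a recording offline, a few lines of Python suffice. This sketch assumes the json format writes one JSON document per line (newline-delimited); adjust the parsing if your recording is laid out differently.

import json

# Read a FusionHub recording written by the "record" sink with "format": "json".
# Assumption: one JSON object per line (newline-delimited JSON).
with open("log.a") as f:
    for line in f:
        line = line.strip()
        if not line:
            continue
        msg = json.loads(line)
        if "fusedPose" in msg:
            q = msg["fusedPose"]["orientation"]
            print(msg["fusedPose"]["timestamp"]["timestamp"],
                  q["w"], q["x"], q["y"], q["z"])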

After starting and connecting the GUI the Auto Calibration section of Fusion Config should show increasing numbers for nImu (number of recorded IMU samples) and nOptical (number of recorded optical samples).

Calibration

There are two calibration steps that are required to operate the BASE filter:

Gyroscope Autocalibration

Gyroscope sensors have a built-in measurement bias that changes over time and is temperature-dependent. Good temperature calibration of MEMS gyroscopes is usually not possible; therefore FusionHub offers the possibility to calibrate this offset at run time. This calibration is semi-automatic.

The measurement bias of the gyroscope attached to the tracked object is calculated as an average of data acquired over a certain interval. For this sampling to happen, the object must be in a static (non-moving) state. The state of the object is determined by the input from the optical tracking. So once the optical tracking system (e.g. ART DTrack) reports the optical target to be static, gyroscope data is sampled, averaged, and a new bias compensation value is calculated.

The result of the autocalibration is saved in autocalibValue.json. When starting FusionHub for the first time, this offset is set to (0, 0, 0). Make sure to place the target with the IMU attached within the tracking volume and keep it static, e.g. by placing it on the floor.

IMU-Optical Intercalibration

The IMU-optical intercalibration calibrates the orientation difference between IMU and the optical tracking body. When setting up a new system or after modifying the optical target a (re-)calibration is needed. The calibration is started by running FusionHub with the runIntercalibration option set to true.

Rotate the target with the IMU attached slowly within the tracking volume. You can monitor the status of the intercalibration in the Intercalibration section on the Fusion Config page of the GUI. After around 50 sampled poses the intercalibration should be finished and the GUI should show the resulting calibration quaternion.

Click Apply Intercalibration Result to automatically insert the result in the configuration file. Click Set and Save at the bottom of the editor to save the result and restart FusionHub.

Check the 3D View page to confirm that the intercalibration result is correct: the red and white cubes should mostly overlap when you rotate your object inside the tracking volume. Note that after a restart it might take a few seconds for the optical and fused poses to converge.

IMU-Optical Fusion Filter

Configuration Block

Node name: fusion

"fusion": {
    "type": "ImuOpticalFusion",
    "settings": {
        "echoFusedPose": false,
        "echoOpticalPose": true,

        "runIntercalibration": true,

        "Autocalibration": {
            "minAgeS": 60.0,
            "nSamplesForAutocalibration": 1500,
            "nSamplesForSteady": 256,
            "noiseRmsLimit": 0.02,
            "steadyThresholdAverage": 0.2,
            "steadyThresholdRms": 1.0
        },

        "MotionDetection": {
            "omegaLimit": 2.0,
            "positionSampleInterval": 1000,
            "rotationFilterAlpha": 0.9,
            "timeToUnknown": 500
        },

        "SensorFusion": {
            "alignment": {
                "w": 1.0,
                "x": 0.0,
                "y": 0.0,
                "z": 0.0
            },

            "orientationWeight": 0.005,
            "tiltCorrection": null,
            "yawWeight": 0.01
        }
    }
}

| Parameter name | Description | Default |
| --- | --- | --- |
| type | Type of sensor fusion. At the moment only the default option is possible. | ImuOpticalFusion |
| echoFusedPose | Print the fused pose as it is output | false |
| echoOpticalPose | Print the optical pose as it is received by the fusion | false |
| runIntercalibration | Starts the intercalibration between IMU and optical target | true |
| minAgeS | Minimum time between two autocalibrations | 60.0 |
| nSamplesForAutocalibration | Number of samples used by the autocalibration | 1500 |
| nSamplesForSteady | Number of samples below threshold needed to trigger calibration | 256 |
| noiseRmsLimit | Noise RMS limit | 0.02 |
| steadyThresholdAverage | Threshold average limit | 0.2 |
| steadyThresholdRms | Threshold RMS limit | 1.0 |
| omegaLimit | Angular velocity limit | 2.0 |
| positionSampleInterval | Interval between two position samples for motion detection | 1000 |
| rotationFilterAlpha | Weight for the rotation low-pass filter | 0.9 |
| timeToUnknown | Interval until the autocalibration state falls back to "unknown" | 500 |
| alignment | Alignment quaternion between IMU and optical target. Insert the result of the intercalibration here. | 1, 0, 0, 0 |
| orientationWeight | Amount of correction of the angle calculated from gyroscope data by optical measurements (roll, pitch, yaw) | 0.005 |
| tiltCorrection | Specify to correct the tilt of the angle calculated from gyroscope data using the vertical derived from gravity measurements. This feature is not available yet. | null |
| yawWeight | Amount of yaw correction by optical data, if tilt correction is active | 0.01 |
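For intuition, orientationWeight and yawWeight act like the correction gain of a complementary filter: the orientation is propagated from gyroscope data and pulled toward the optical measurement by a small fraction on every update. The sketch below illustrates the principle on a single angle; it is a conceptual model, not FusionHub's actual implementation.

import math

def complementary_step(angle, gyro_rate, optical_angle, dt, weight=0.005):
    """One update of a 1D complementary filter.

    weight plays the role of orientationWeight: the fraction of the
    observed gyro-vs-optical error corrected per update.
    """
    predicted = angle + gyro_rate * dt                    # integrate gyroscope
    error = optical_angle - predicted                     # drift seen by optics
    error = math.atan2(math.sin(error), math.cos(error))  # wrap to [-pi, pi]
    return predicted + weight * error                     # gentle pull toward optics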

This filter needs as input:

  • Optical tracking source

  • IMU source

This Filter outputs:

  • fusedPose

Output Data Format

{
	"fusedPose": {
		"lastDataTime": {
			"timestamp": 0
		},
		"orientation": {
			"w": 1.0,
			"x": 0.0,
			"y": 0.0,
			"z": 0.0
		},
		"position": {
			"x": 0.0,
			"y": 0.0,
			"z": 0.0
		},
		"timestamp": {
			"timestamp": 0
		}
	}
}

| Parameter name | Description | Unit |
| --- | --- | --- |
| lastDataTime | Unused | s |
| orientation | Orientation quaternion | without unit |
| position | Unused | m |
| timestamp | Time of data acquisition | ns |
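Downstream applications often need roll, pitch and yaw instead of the raw quaternion. The helper below sketches the conversion; it assumes the common Z-Y-X (yaw-pitch-roll) convention, which may differ from the axis conventions of your tracking setup.

import math

def quat_to_euler(w, x, y, z):
    """Convert a unit quaternion to (roll, pitch, yaw) in radians, Z-Y-X order."""
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    sinp = 2.0 * (w * y - z * x)
    pitch = math.copysign(math.pi / 2.0, sinp) if abs(sinp) >= 1.0 else math.asin(sinp)
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return roll, pitch, yaw

# The identity quaternion from the output format above maps to zero angles:
print(quat_to_euler(1.0, 0.0, 0.0, 0.0))  # (0.0, 0.0, 0.0)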

Source Options

Optical Tracking Source Options

Advanced Realtime Tracking (ART)

FusionHub works with all ART tracking systems, based on their DTrack tracking software.

"type": "DTrack",
"settings": {
    "port": 5005,
    "bodyID": 3,
    "endpoint": "inproc://optical_data_source_1"
}

Optitrack

FusionHub works with all Optitrack tracking systems based on their Motive tracking software.

"type": "Optitrack",
"settings": {
    "host": "localhost",
    "connectionType": "Multicast",
    "bodyID": 444
}

VICON

FusionHub consumes VICON’s DataStream protocol. Communication has been tested with their Shogun software.

"type": "Vicon",
    "settings": {
    "host": "localhost",
    "subject": "VCam"
}

Antilatency

FusionHub connects directly to Antilatency’s USB or wireless trackers.

"type": "Antilatency",
"settings": {
    "endpoint": "inproc://optical_data_source_1",
    "environmentLink": "AntilatencyAltEnvironmentHorizontalGrid~AgAEBLhTiT_cRqA-r45jvZqZmT4AAAAAAAAAAACamRk_AQQCAwICAgICAQICAAI",
    "placementLink": "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA"
}

IMU Source

FusionHub supports all LP-RESEARCH IMUs.

See further below for a description of how to prepare LPMS-IG1 for operation with FusionHub.

LPMS-IG1

"imu": {
    "type": "OpenZen",
    "settings": {
        "autodetectType": "ig1"
    }
}

LPMS-CURS3

"imu": {
    "type": "OpenZen",
    "settings": {
        "autodetectType": "lpms"
    }
}

Graphical User Interface

Dashboard

3D Viewer

Sensor Fusion Configuration and Calibration Status

General Settings

MOVE Filter Configuration

FusionHub MOVE adds an additional platform IMU to the BASE configuration. It combines data from both IMUs to calculate poses relative to a moving platform.

FLOW Filter Configuration

FusionHub FLOW combines odometry, GPS and IMU data from a vehicle to calculate high-accuracy and low-latency global localization information. The FLOW filter has two operation modes with different configuration blocks in config.json and different output formats. The two modes are:

  • Low-dynamics filter (LD)

  • High-dynamics filter (HD)

The diagram below shows an overview of a simple FLOW filter setup.

Low-dynamics Filter (Odometry + GPS + (some) IMU)

Configuration Block

Node name: vehicularFusion

// Sensor fusion config
"vehicularFusion": {
    "echoFusedPose": false,
    "endpoint": "tcp://*:8801",
    "fuser": {
        "fitModel": "SimpleCarModel",
        "driveModel": "Differential",
        "velError": 0.277777778,
        "omegaError": 0.5,
        "measurementError": 0.1,
        "smoothFit": true
    }
}

| Parameter name | Description | Default |
| --- | --- | --- |
| echoFusedPose | Print the fusedVehiclePose output to the command line | false |
| endpoint | Output port for the fusion result | 8801 |
| fitModel | Model to use for fusion. At the moment only SimpleCarModel is supported. | SimpleCarModel |
| driveModel | Model used to calculate the car trajectory from CAN bus data. At the moment only Differential is supported. | Differential |
| velError | Velocity error for the Kalman filter. Keep the default value. | 0.277777778 |
| omegaError | Omega error for the Kalman filter. Keep the default value. | 0.5 |
| measurementError | Measurement error for the Kalman filter. Keep the default value. | 0.1 |
| smoothFit | Enable this option to prevent the filter output from jumping between odometry data and GPS measurements. Keep enabled. | true |

This filter needs as input:

  • LPMS-IG1P data source for IMU and GPS data

"imuP": {
    "type": "DualRtk",
    "settings": {
        "sensor1": {
            // If specification needed, insert first IG1 sensor name here
            //"name": "ig1p232800650050",
            "autodetectType": "ig1p"
        },
        "rtcm": true,
        "imuEndpoint": "tcp://*:8802"
    }
}

  • CAN bus and vehicle decoder source

"vehicle": {
    "type": "Automotive",
    "vehicleStateEndpoint": "tcp://*:8999",
    "settings": {
        "canInterface": "PeakCAN",
        "vehicleType": "R56"
    }
}

This Filter outputs:

  • fusedVehiclePose

Output Data Format

{
    "fusedVehiclePose": {
        "acceleration": {
            "x": 0.0,
            "y": 0.0,
            "z": 0.0
        },
        "globalPosition": {
            "x": 0.0,
            "y": 0.0
        },
        "lastDataTime": {
            "timestamp": 0
        },
        "position": {
            "x": 0,
            "y": 0
        },
        "timestamp": {
            "timestamp": 0
        },
        "utmZone": "31T",
        "yaw": 0
      }
}

| Parameter name | Description | Unit |
| --- | --- | --- |
| acceleration | 3D acceleration vector as measured by the IMU. Describes the orientation of the vehicle. | m/s^2 |
| globalPosition | Longitude and latitude | degrees |
| lastDataTime | Unused | s |
| position | Position relative to the starting point, with X pointing north and Y pointing east in the current UTM frame | m |
| timestamp | Timestamp of data acquisition | ns |
| utmZone | UTM zone | UTM string |
| yaw | Globally referenced yaw angle | rad |

Note: The fusedVehiclePose contains a 3D acceleration vector, defined in the following manner: the configuration flag imuToCarRotation takes a quaternion used to rotate vectors from the IMU frame to the car frame. By default it is the identity quaternion. For the LD model, the measured IMU acceleration is simply rotated by imuToCarRotation and written to the output.

In the LD filter, pitch and roll have to be derived from the acceleration data based on a model of the stiffness of the chassis, which assumes a flat surface. The HD model offers full 6-DOF output, and we are planning to unify the two so that all data is available at all times.
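If you need to apply the same rotation on the consumer side, rotating a vector by the imuToCarRotation quaternion is straightforward. A minimal, dependency-free sketch:

def rotate_vector(q, v):
    """Rotate vector v = (x, y, z) by unit quaternion q = (w, x, y, z),
    i.e. compute q * v * q^-1 (the operation applied to the IMU
    acceleration via imuToCarRotation)."""
    w, qx, qy, qz = q
    vx, vy, vz = v
    # t = 2 * (q_vec x v)
    tx = 2.0 * (qy * vz - qz * vy)
    ty = 2.0 * (qz * vx - qx * vz)
    tz = 2.0 * (qx * vy - qy * vx)
    # v' = v + w * t + q_vec x t
    return (vx + w * tx + qy * tz - qz * ty,
            vy + w * ty + qz * tx - qx * tz,
            vz + w * tz + qx * ty - qy * tx)

# The default identity quaternion leaves the acceleration unchanged:
print(rotate_vector((1.0, 0.0, 0.0, 0.0), (0.0, 0.0, 9.81)))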

Example Configuration

Playback and fusion of prerecorded data: gpsImuFusionPlayback.json

Real-time fusion: gpsOdometryFusion.json

High-Dynamics Filter (IMU + GPS)

Node name: gnssImuFusion

Configuration block example (in sinks section)

"gnssImuFusion": {
    "echoFusedPose": false,
    "endpoint": "tcp://*:8803",
    "fuser": {
        "fitModel": "ModelGnssImu",
        "accelError": 0.01,
        "omegaError": 0.02,
        "measurementError": 0.05,
        "imuToCarRotation": {
            "w": 1,
            "x": 0,
            "y": 0,
            "z": 0
        }
    }
}

| Parameter name | Description | Default |
| --- | --- | --- |
| echoFusedPose | Print the fusedVehiclePose output to the command line | false |
| endpoint | Output port for the fusion result | 8803 |
| fitModel | Model to use for fusion. For the high-dynamics filter this is ModelGnssImu. | ModelGnssImu |
| accelError | Acceleration error for the Kalman filter. Keep the default value. | 0.01 |
| omegaError | Omega error for the Kalman filter. Keep the default value. | 0.02 |
| measurementError | Measurement error for the Kalman filter. Keep the default value. | 0.05 |
| imuToCarRotation | Orientation quaternion of the IMU relative to the car frame | 1, 0, 0, 0 |

This filter needs as input:

  • LPMS-IG1P data source for IMU and GPS data

"imuP": {
    "type": "DualRtk",
    "settings": {
        "sensor1": {
            // If specification needed, insert first IG1 sensor name here
            //"name": "ig1p232800650050",
            "autodetectType": "ig1p"
        },
        "rtcm": true,
        "imuEndpoint": "tcp://*:8802"
    }
}

  • CAN bus and vehicle decoder source

"vehicle": {
    "type": "Automotive",
    "vehicleStateEndpoint": "tcp://*:8999",
    "settings": {
        "canInterface": "PeakCAN",
        "vehicleType": "R56"
    }
}

This Filter outputs:

  • fusedVehiclePose

  • fusedPose

Output data format

{
    "fusedVehiclePose": {
        "acceleration": {
            "x": 0.0,
            "y": 0.0,
            "z": 0.0
        },
        "globalPosition": {
            "x": 0.0,
            "y": 0.0
        },
        "lastDataTime": {
            "timestamp": 0
        },
        "position": {
            "x": 0.0,
            "y": 0.0
        },
        "timestamp": {
            "timestamp": 0
        },
        "utmZone": "31T",
        "yaw": 0.0
    }
}

| Parameter name | Description | Unit |
| --- | --- | --- |
| acceleration | 3D acceleration vector as measured by the IMU. Describes the orientation of the vehicle. | m/s^2 |
| globalPosition | Longitude and latitude | degrees |
| lastDataTime | Unused | s |
| position | Position within the UTM zone | m |
| timestamp | Timestamp of data acquisition | ns |
| utmZone | UTM zone | UTM string |
| yaw | Globally referenced yaw angle | rad |

{
	"fusedPose": {
		"lastDataTime": {
			"timestamp": 0
		},
		"orientation": {
			"w": 1.0,
			"x": 0.0,
			"y": 0.0,
			"z": 0.0
		},
		"position": {
			"x": 0.0,
			"y": 0.0,
			"z": 0.0
		},
		"timestamp": {
			"timestamp": 0
		}
	}
}

| Parameter name | Description | Unit |
| --- | --- | --- |
| lastDataTime | Unused | s |
| orientation | Orientation quaternion | without unit |
| position | Unused | m |
| timestamp | Time of data acquisition | ns |

Example Configuration

Playback and fusion of prerecorded data: gpsImuFusionPlayback.json

Graphical User Interface

Map View

Communication with External Applications

Sending FusionHub Data to External Applications via the ZeroMQ Interface

FusionHub emits data resulting from the sensor fusion through the local network interface.

Output Ports

The network port that this information is output to can be configured in the JSON parameter file config.json of FusionHub.

Data Format

As the low-level transport for the output data we use ZeroMQ (publisher / subscriber). The data itself is encoded as Protocol Buffers messages. Protocol Buffers are documented here. Messages are defined in the Protobuf (.proto) format in the file stream_data.proto. This file is contained in the installation folder of FusionHub.

Python Resources

Download a Python example that shows how to decode messages from FusionHub from this repository.

Prerequisites can be installed in your Python 3 environment with this:

pip install pyzmq
pip install protobuf

Make sure to set the input port in FusionHubPythonExample.py correctly. For example, with the Antilatency source definition below, the port needs to be set to 8899.

"optical": {
    "type": "Antilatency",
    "settings": {
        // Use this for access from an external process eg. ALVR
        "endpoint": "tcp://*:8899",
        
        "environmentLink": "AntilatencyAltEnvironmentHorizontalGrid~AgAEBLhTiT_cRqA-r45jvZqZmT4AAAAAAAAAAACamRk_AQQCAwICAgICAQICAAI",
        "placementLink": "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA"
    }
}
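A minimal subscriber sketch is shown below. It only demonstrates the ZeroMQ side; the protobuf message class name is a placeholder, since the actual message definitions are in stream_data.proto (generate the Python module with protoc --python_out=. stream_data.proto).

import zmq
# import stream_data_pb2  # generated from stream_data.proto via protoc

ctx = zmq.Context()
sock = ctx.socket(zmq.SUB)
sock.connect("tcp://localhost:8899")       # must match the endpoint in config.json
sock.setsockopt_string(zmq.SUBSCRIBE, "")  # subscribe to all messages

while True:
    raw = sock.recv()
    # Placeholder: check stream_data.proto for the actual message type name.
    # msg = stream_data_pb2.StreamData()
    # msg.ParseFromString(raw)
    print(f"received {len(raw)} bytes")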

C# Resources

On parsing Protobuf files: https://github.com/5argon/protobuf-unity

How to subscribe to ZeroMQ messages: https://github.com/gench23/unity-zeromq-client and https://tech.uqido.com/2020/09/29/zeromq-in-unity/

VRPN Output

VRPN output is configured in the following part of the sinks section of config.json. The device name will be referenced by the plugin for Unreal Engine.

"VRPN": {
  "settings": {
    "deviceName": "Fusion Hub"
  }
}	  

Please see below how we achieve data input via VRPN in Unreal Engine. First, install the VRPN LiveLink plugin:

Configure the VRPN source with the correct device and subject name:

Apply the output from FusionHub to an Unreal object, e.g. a cine camera actor.

Hardware Preparation

Inertial Measurement Units

General documentation for LPMS IMUs is here.

Switching LPMS-IG1(P) to USBxpress Mode

Note: These instructions work for LPMS-IG1 (IMU only) and LPMS-IG1P (IMU + GPS).

First, download LpmsControl 2 from here and install it.

Connect LPMS-IG1(P) to your computer and start LpmsControl 2.

In LpmsControl 2 select one of the LPMS-IG1(P) sensors and connect to it.

In case the sensor is in VCP (virtual COM port) mode as shown below, click on Convert to switch the sensor to USBxpress mode. This is required for communication with FusionHub.

After converting the sensor to USBxpress mode it should be displayed as such.

The image below shows typical output from LPMS-IG1(P) after connecting.

Close LpmsControl 2 to disconnect from the sensor. You are now ready to use LPMS-IG1(P) in FusionHub.

Optical Tracking Systems

  • Room calibration

  • Adding and adjusting targets

Release Notes

Version 1.2

Release date: 2023/1/5

  • GUI as standalone application

  • Support for LPMS-CURS3 and other series 3 sensors as input source (BASE and MOVE)

  • Added GPS-IMU filter (FLOW)

  • More example configurations

  • Added sample data for vehicle localization

  • Various bug fixes and smaller modifications

Version 1.1

Release date: 2022/11/21

  • New graphical user interface

  • Added GPS-odometry fusion for automobile localization

Version 1.0

Release date: 2022/8/25

  • First full release

  • Added IMU-optical fusion
