Introduction
FusionHub is a software application that combines various sensor data inputs into a single higher-level data output. There are three basic versions of FusionHub:
FusionHub BASE combines data from an outside-in tracking system with inertial measurements from an IMU. Typical applications: Head-mounted display tracking for VR/AR applications, camera tracking for virtual production
FusionHub MOVE adds an additional platform IMU to the BASE configuration. It combines data from both IMUs to calculate poses relative to a moving platform. Typical applications: AR/VR in a vehicle, aircraft, or on a simulator platform
FusionHub FLOW combines odometry, GPS and IMU data from a vehicle to calculate high-accuracy and low-latency global localization information. Typical applications: Automobile localization, robot localization
The diagram below shows the general structure of FusionHub. Sources and sinks are connected by a filter unit. The sensor fusion functionality is contained in this filter unit. The filter parameters as well as the parameters of input and output blocks can be configured via a configuration script or the graphical user interface.
The graphical user interface is detached from the main FusionHub application and both applications can therefore run on separate computers. This provides flexibility for running FusionHub on devices with limited monitoring capabilities like a head mounted display.
General
Running FusionHub
FusionHub consists of two components:
The main application
A graphical user interface application
The main FusionHub application is started by running `FusionHub.exe`. No specific installation is needed; the application can be run directly from its deployment directory. It is a command line application that reads its configuration from the file `config.json`. We will explain the contents and options of the configuration file further below.
Please install the graphical user interface by running `lp-fusionhub-dashboard_0.1.0_x64_en-US.msi`. This adds `lp-fusionhub-dashboard` to your start menu; launch the application from there. After starting `FusionHub.exe`, press the Connect button to connect client and server. In case you are running FusionHub on a separate machine, make sure to enter the correct IP address.
BASE Filter Configuration
Optical Tracking Source Options
ART
"type": "DTrack", "settings": { "port": 5005, "bodyID": 3, "endpoint": "inproc://optical_data_source_1" }
Optitrack
"type": "Optitrack", "settings": { "host": "localhost", "connectionType": "Multicast", "bodyID": 444 }
VICON
"type": "Vicon", "settings": { "host": "localhost", "subject": "VCam" }
Antilatency
"type": "Antilatency", "settings": { // Use this for access from an external process eg. ALVR //"endpoint": "tcp://*:8899", // Use this for internal access eg. sensor fusion "endpoint": "inproc://optical_data_source_1", "environmentLink": "AntilatencyAltEnvironmentHorizontalGrid~AgAEBLhTiT_cRqA-r45jvZqZmT4AAAAAAAAAAACamRk_AQQCAwICAgICAQICAAI", "placementLink": "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA" }
FLOW Filter Configuration
The FLOW filter has two operation modes with different configuration blocks in `config.json` and different output formats. The two modes are:
Low-dynamics filter (LD)
High-dynamics filter (HD)
Low-dynamics Filter (Odometry + GPS + (some) IMU)
Configuration block example (in the `sinks` section):
Node name: vehicularFusion
// Sensor fusion config "vehicularFusion": { "echoFusedPose": false, "endpoint": "tcp://*:8801", "fuser": { "fitModel": "SimpleCarModel", "driveModel": "Differential", "velError": 0.277777778, "omegaError": 0.5, "measurementError": 0.1, "smoothFit": true } }
Parameter name | Description | Default |
---|---|---|
echoFusedPose | Print the fusedVehiclePose output to the command line | false |
endpoint | Output port for the fusion result | 8801 |
fitModel | Model to use for fusion. At the moment only SimpleCarModel is supported. | SimpleCarModel |
driveModel | Model used to calculate the car trajectory from CAN bus data. At the moment only Differential is supported. | Differential |
velError | Velocity error for the Kalman filter. Keep the default value. | 0.277777778 |
omegaError | Omega error for the Kalman filter. Keep the default value. | 0.5 |
measurementError | Measurement error for the Kalman filter. Keep the default value. | 0.1 |
smoothFit | Prevents the filter output from jumping between odometry data and GPS measurements. Keep enabled. | true |
This filter needs as input:
LPMS-IG1P data source for IMU and GPS data
"imuP": { "type": "DualRtk", "settings": { "sensor1": { // If specification needed, insert first IG1 sensor name here //"name": "ig1p232800650050", "autodetectType": "ig1p" }, "rtcm": true, "imuEndpoint": "tcp://*:8802" } }
CAN bus and vehicle decoder source
"vehicle": { "type": "Automotive", "vehicleStateEndpoint": "tcp://*:8999", "settings": { "canInterface": "PeakCAN", "vehicleType": "R56" } }
This filter outputs:
fusedVehiclePose
Output data format
{ "fusedVehiclePose": { "acceleration": { "x": -0.4263402493894084, "y": -0.14872631710022688, "z": 9.790632347106932 }, "globalPosition": { "x": 1.8985360999771979, "y": 41.50585830111033 }, "lastDataTime": { "timestamp": 0 }, "position": { "x": 0, "y": 0 }, "timestamp": { "timestamp": 48347424440200 }, "utmZone": "31T", "yaw": 0 } }
Parameter name | Description | Unit |
---|---|---|
acceleration | 3D acceleration vector as measured by the IMU; its direction relative to gravity reflects the orientation of the vehicle. | m/s^2 |
globalPosition | Longitude (x) and latitude (y) | degrees |
lastDataTime | Unused | s |
position | Position within the UTM zone | m |
timestamp | Timestamp of data acquisition | ns |
utmZone | UTM zone | UTM string |
yaw | Globally referenced yaw angle | rad |
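To illustrate how the fields above fit together, the sketch below parses a fusedVehiclePose message (with hypothetical values in the JSON form shown above) using only the Python standard library and converts the units into more common forms. The field names match the output format; everything else is illustrative.

```python
import json
import math

# A fusedVehiclePose message in the JSON form shown above (hypothetical values)
raw = '''
{ "fusedVehiclePose": {
    "globalPosition": { "x": 1.8985361, "y": 41.5058583 },
    "position": { "x": 0, "y": 0 },
    "timestamp": { "timestamp": 48347424440200 },
    "utmZone": "31T",
    "yaw": 1.5707963267948966 } }
'''

msg = json.loads(raw)["fusedVehiclePose"]

# globalPosition carries longitude (x) and latitude (y) in degrees
lon, lat = msg["globalPosition"]["x"], msg["globalPosition"]["y"]

# timestamp is in nanoseconds; convert to seconds
t_s = msg["timestamp"]["timestamp"] / 1e9

# yaw is a globally referenced angle in radians; convert to degrees
yaw_deg = math.degrees(msg["yaw"])

print(lat, lon, t_s, yaw_deg)
```

Note that `position` is local to the UTM zone given in `utmZone`; converting it to latitude/longitude requires a UTM projection library and is not shown here.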
High-Dynamics Filter (IMU + GPS)
Node name: gnssImuFusion
Configuration block example (in the `sinks` section):
"gnssImuFusion": { "echoFusedPose": false, "endpoint": "tcp://*:8803", "fuser": { "fitModel": "ModelGnssImu", "accelError": 0.01, "omegaError": 0.02, "measurementError": 0.05, "imuToCarRotation": { "w": 1, "x": 0, "y": -1, "z": 0 } } }
Parameter name | Description | Default |
---|---|---|
echoFusedPose | Print the fusedVehiclePose output to the command line | false |
endpoint | Output port for the fusion result | 8803 |
fitModel | Model to use for fusion. At the moment only ModelGnssImu is supported. | ModelGnssImu |
accelError | Acceleration error for the Kalman filter. Keep the default value. | 0.01 |
omegaError | Omega error for the Kalman filter. Keep the default value. | 0.02 |
measurementError | Measurement error for the Kalman filter. Keep the default value. | 0.05 |
imuToCarRotation | Orientation quaternion of the IMU relative to the car frame | 1, 0, -1, 0 |
This filter needs as input:
LPMS-IG1P data source for IMU and GPS data
"imuP": { "type": "DualRtk", "settings": { "sensor1": { // If specification needed, insert first IG1 sensor name here //"name": "ig1p232800650050", "autodetectType": "ig1p" }, "rtcm": true, "imuEndpoint": "tcp://*:8802" } }
CAN bus and vehicle decoder source
"vehicle": { "type": "Automotive", "vehicleStateEndpoint": "tcp://*:8999", "settings": { "canInterface": "PeakCAN", "vehicleType": "R56" } }
This filter outputs:
fusedVehiclePose
fusedPose
Output data format
{ "fusedVehiclePose": { "acceleration": { "x": 0.0, "y": 0.0, "z": 0.0 }, "globalPosition": { "x": 1.8982356601544925, "y": 41.50544434418204 }, "lastDataTime": { "timestamp": 0 }, "position": { "x": -25.38332083641826, "y": -36.403733501197635 }, "timestamp": { "timestamp": 48910226723400 }, "utmZone": "31T", "yaw": 0.1555754684457767 } }
Parameter name | Description | Unit |
---|---|---|
acceleration | 3D acceleration vector as measured by the IMU; its direction relative to gravity reflects the orientation of the vehicle. | m/s^2 |
globalPosition | Longitude (x) and latitude (y) | degrees |
lastDataTime | Unused | s |
position | Position within the UTM zone | m |
timestamp | Timestamp of data acquisition | ns |
utmZone | UTM zone | UTM string |
yaw | Globally referenced yaw angle | rad |
{ "fusedPose": { "lastDataTime": { "timestamp": 0 }, "orientation": { "w": 0.4437907292666558, "x": 0.5659687502206026, "y": -0.4749652416904733, "z": 0.5070869566224411 }, "position": { "x": -25.383320836418306, "y": -36.403733501197166, "z": 163.98272320756405 }, "timestamp": { "timestamp": 48910226723400 } } }
Parameter name | Description | Unit |
---|---|---|
lastDataTime | Unused | s |
orientation | Orientation quaternion | n/a |
position | Unused | m |
timestamp | Timestamp of data acquisition | ns |
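As a sketch of how the orientation quaternion in fusedPose can be consumed, the snippet below extracts a yaw angle from a quaternion using the standard z-axis Euler conversion. The axis convention of FusionHub's quaternion is not documented here, so treat the mapping to vehicle heading as an assumption; the input values are hypothetical.

```python
import json
import math

# A fusedPose message (hypothetical values; quaternion is a 90° rotation about z)
raw = '''
{ "fusedPose": {
    "orientation": { "w": 0.7071067811865476, "x": 0.0, "y": 0.0, "z": 0.7071067811865476 },
    "position": { "x": -25.38, "y": -36.40, "z": 163.98 },
    "timestamp": { "timestamp": 48910226723400 } } }
'''

pose = json.loads(raw)["fusedPose"]
q = pose["orientation"]
w, x, y, z = q["w"], q["x"], q["y"], q["z"]

# Standard quaternion-to-Euler yaw extraction (rotation about the z axis)
yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
yaw_deg = math.degrees(yaw)
print(yaw_deg)
```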
Communication with External Applications
Sending FusionHub Data to External Applications via the ZeroMQ Interface
FusionHub emits data resulting from the sensor fusion through the local network interface.
Output Ports
The network port that this information is output to can be configured in the JSON parameter file `config.json` of FusionHub.
Data Format
As the low-level protocol to emit the output data we use ZeroMQ (publisher/subscriber). The data itself is serialized as Protocol Buffers. Protocol Buffers are documented here. Messages are defined in the Protobuf (.proto) format in the file `stream_data.proto`. This file is contained in the installation folder of FusionHub.
Python Resources
Download a Python example that shows how to decode messages from FusionHub from this repository.

Prerequisites can be installed in your Python 3 environment with:

```shell
pip install zmq
pip install protobuf
```
Make sure to set the input port in `FusionHubPythonExample.py` correctly. For example, for the Antilatency source definition below, the port needs to be set to `8899`.
"optical": { "type": "Antilatency", "settings": { // Use this for access from an external process eg. ALVR "endpoint": "tcp://*:8899", "environmentLink": "AntilatencyAltEnvironmentHorizontalGrid~AgAEBLhTiT_cRqA-r45jvZqZmT4AAAAAAAAAAACamRk_AQQCAwICAgICAQICAAI", "placementLink": "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA" } }
C# Resources
On parsing Protobuf files: https://github.com/5argon/protobuf-unity
How to subscribe to ZeroMQ messages: https://github.com/gench23/unity-zeromq-client and https://tech.uqido.com/2020/09/29/zeromq-in-unity/
VRPN Output
VRPN output is configured in the following part of the `sinks` section of `config.json`. The device name will be referenced by the plugin for the Unreal engine.
"VRPN": { "settings": { "deviceName": "Fusion Hub" } }
Please see below how we achieve data input via VRPN in the Unreal engine. First, install the VRPN LiveLink plugin:
Configure the VRPN source with the correct device and subject name:
Apply the output from FusionHub to an Unreal object, e.g. a cine camera actor.
Release Notes
Version 1.1
Release date: 2022/11/21
New graphical user interface
Added GPS-odometry fusion for automobile localization
Version 1.0
Release date: 2022/8/25
First full release
Added IMU-optical fusion