...
FusionHub is a software application that combines a number of sensor data inputs to create a higher-level information output. There are three basic versions of FusionHub:
FusionHub BASE combines data from an outside-in tracking system with measurements from an inertial measurement unit (IMU). Typical applications: Head-mounted display tracking for VR/AR, camera tracking for virtual production
FusionHub MOVE adds an additional platform IMU to the BASE configuration. It combines data from both IMUs to calculate poses relative to a moving platform. Typical applications: AR/VR in a vehicle, aircraft, or on a simulator platform
FusionHub FLOW combines odometry, GPS and IMU data from a vehicle to calculate high-accuracy and low-latency global localization information. Typical applications: Automobile localization, robot localization
...
Set up your optical tracking system. Attach the IMU to the optical target, or attach both to the same rigid object, e.g. an HMD. Initialize the optical tracking body in your motion capture software and note the object ID.
Connect your IMU to the computer running FusionHub. Use LpmsControl 2 to confirm that the computer can connect to the IMU and read data. Make sure to disconnect from LpmsControl 2 before running FusionHub.
Modify
config.json
to contain the correct information for your IMU and optical tracking system. See below for how to configure the various blocks of the configuration file. The configuration file can also be modified through the FusionHub GUI as shown further below.
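As a rough orientation, a BASE setup typically needs one source block for the optical tracking system and one for the IMU. The sketch below is purely illustrative: the block and key names (sources, optical, opticalDataPort, bodyID, imu, sensorName) are assumptions made for this example and have to be replaced by the names documented for your FusionHub version and tracking system.

```json
{
  "sources": {
    "optical": {
      "opticalDataPort": 5000,
      "bodyID": 1
    },
    "imu": {
      "sensorName": "LPMSIG1-0001"
    }
  }
}
```

The bodyID corresponds to the object ID noted in the motion capture software, and the sensor name can be looked up in LpmsControl 2.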
...
If all components are connected and the configuration file is valid, FusionHub should work right away after starting the application. The console output shows a log of the initialization of the various components. Note that you can log the output from FusionHub to a file by adding
...
Gyroscope sensors have a built-in measurement bias that changes over time and is temperature-dependent. A good, permanent temperature calibration of MEMS gyroscopes is hard to achieve, therefore FusionHub offers the possibility to calibrate this offset at run time. This calibration is semi-automatic.
The measurement bias of the gyroscope attached to the tracked object is calculated as an average of the data acquired over a certain time interval. The requirement for this sampling to happen is that the object is in a non-moving / static state. The state of the object is determined by the input data from the optical tracking. Once the optical tracking system (e.g. ART DTrack) reports the optical target to be static, gyroscope data is sampled and averaged, and a new bias compensation vector is calculated.
The result of the autocalibration is saved in autocalibValue.json
. When starting FusionHub for the first time, this offset is set to (0, 0, 0)
. Make sure to place the target, with the IMU attached, within the tracking volume and keep it static, e.g. by putting it on the floor.
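The stored offset is simply a three-component gyroscope bias vector. As a hypothetical illustration, a freshly created autocalibValue.json with the initial zero offset could look like the following; the key name gyroBias is an assumption, and the file written by FusionHub may use a different structure.

```json
{
  "gyroBias": [0.0, 0.0, 0.0]
}
```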
IMU-Optical Intercalibration
The IMU-optical intercalibration calibrates the orientation difference between the IMU and the optical tracking body. When setting up a new system or after modifying the optical target, a (re-)calibration is needed. The calibration is started by running FusionHub with the runIntercalibration
option set to true
.
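The option is a plain boolean flag in config.json. Where exactly it is placed depends on your configuration layout; the top-level placement shown below is only an assumption for illustration.

```json
{
  "runIntercalibration": true
}
```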
...
Click Apply Intercalibration Result
to automatically insert the result into the configuration file. Click Set
and Save
at the bottom of the editor to save the result and restart FusionHub.
Check the 3D View
page to confirm that the intercalibration result is correct. The red and white cubes should overlap almost exactly at all times when you rotate your object inside the tracking volume. Note that after a restart it might take a few seconds for the optical and fused poses to converge.
...
FusionHub supports all LP-RESEARCH IMUs.
See a description of how to prepare LPMS-IG1 for operation with FusionHub further below.
...
FusionHub MOVE adds an additional platform IMU to the BASE configuration. It combines data from both IMUs to calculate poses relative to a moving platform.
The MOVE filter section of FusionHub is still under development. Refer to LPVR-DUO for an implementation of the filter for specific virtual / augmented reality headsets.
...
LPMS-IG1P needs to be installed in the vehicle in a known orientation, ideally with the coordinate axes of the IMU parallel to the vehicle coordinate system. As the vehicle reference frame we use the SAE coordinate system as shown in the image below. Connect the USB connector of the LPMS-IG1P sensor to the host computer. If needed, an active or passive USB extension can be used. Make sure to check data integrity with the LpmsControl 2 data acquisition tool; we have noticed communication issues with some passive USB extensions.
...
Global Positioning System (GPS)
The GPS receiver is integrated with the LPMS-IG1P sensor. Connect the antenna cable and place the GPS antenna on top of the vehicle.
...
Parameter name | Description | Default |
---|---|---|
echoFusedPose | fusedVehiclePose output is printed to command line | false |
endpoint | Output port for the fusion result | 8801 |
fitModel | Model to use for fusion. At the moment only SimpleCarModel is available. | SimpleCarModel |
driveModel | Model used to calculate the car trajectory from CAN bus data. At the moment only Differential is available. | Differential |
velError | Velocity error for Kalman filter. Keep default value. | 0.277777778 |
omegaError | Omega error for Kalman filter. Keep default value. | 0.5 |
measurementError | Measurement error for Kalman filter. Keep default value. | 0.1 |
smoothFit | Enable this option to prevent filter output from jumping between odometry data and GPS measurement. Keep enabled. | true |
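Putting the defaults from the table together, the fusion block of a FLOW configuration could look roughly like the sketch below. The surrounding block name vehicleFusion is a placeholder for this example, and the endpoint is written in the tcp://*:port notation used for the IMU endpoint further below; the parameter names and values themselves are taken from the table above.

```json
{
  "vehicleFusion": {
    "echoFusedPose": false,
    "endpoint": "tcp://*:8801",
    "fitModel": "SimpleCarModel",
    "driveModel": "Differential",
    "velError": 0.277777778,
    "omegaError": 0.5,
    "measurementError": 0.1,
    "smoothFit": true
  }
}
```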
...
Parameter name | Description | Default |
---|---|---|
type | Type of GPS receiver. Currently only DualRTK is supported. | DualRTK |
name | The name of the LPMS-IG1P sensor used in this setup. This parameter is optional. If FusionHub is operated at the same time as LPVR-DUO, we recommend specifying the sensor name. Look up the sensor name in LpmsControl 2. | n/a |
autodetectType | Type of sensor to be autodetected | ig1p |
rtcm | Set to true if RTCM input is to be received, e.g. from an NTRIP source. | false |
imuEndpoint | Output endpoint of IMU data. This parameter is optional. | tcp://*:8802 |
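A corresponding sensor source block could then look roughly as follows. The block name sensor and the example sensor name are placeholders; the keys and default values follow the table above.

```json
{
  "sensor": {
    "type": "DualRTK",
    "name": "ig1pExampleSensorName",
    "autodetectType": "ig1p",
    "rtcm": false,
    "imuEndpoint": "tcp://*:8802"
  }
}
```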
...
Parameter name | Description | Unit |
---|---|---|
acceleration | 3D acceleration vector as measured by the IMU. Describes the orientation of the vehicle in the vehicle coordinate system. | m/s^2 |
globalPosition | Longitude and latitude in degrees | degrees |
lastDataTime | Ignore, unused | s |
position | Position relative to starting point with X pointing North and Y pointing East in the current UTM frame | m |
timestamp | Timestamp of data acquisition | ns |
utmZone | UTM zone | UTM string |
yaw | Globally referenced yaw angle | rad |
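Assuming the fused vehicle pose is published as a JSON message containing the fields listed above, a single sample could look like the sketch below. All numbers are made-up example values and the exact message framing may differ; globalPosition is shown in the longitude, latitude order given in the table.

```json
{
  "acceleration": [0.12, -0.03, 9.79],
  "globalPosition": [139.6380, 35.4437],
  "lastDataTime": 0.01,
  "position": [12.34, -3.21],
  "timestamp": 1650012345678900000,
  "utmZone": "54S",
  "yaw": 1.57
}
```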
...
Parameter name | Description | Default |
---|---|---|
echoFusedPose | Fused pose output is printed to command line | false |
endpoint | Output port for the fusion result | 8801 |
fitModel | Model to use for fusion. At the moment only SimpleCarModel is available. | SimpleCarModel |
accelError | Model used to calculate the car trajectory from CAN bus data. At the moment only Differential is available. | Differential |
omegaError | Omega error for Kalman filter. Keep default value. | 0.5 |
measurementError | Measurement error for Kalman filter. Keep default value. | 0.1 |
imuToCarRotation | Orientation quaternion of IMU relative to car frame | 1, 0, 0, 0 |
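As in the FLOW case above, these parameters form one fusion block in config.json. In the sketch below the block name imuVehicleFusion is a placeholder and the endpoint again uses the tcp://*:port notation as an assumption; the remaining values are the defaults from the table, with imuToCarRotation set to the identity quaternion.

```json
{
  "imuVehicleFusion": {
    "echoFusedPose": false,
    "endpoint": "tcp://*:8801",
    "fitModel": "SimpleCarModel",
    "omegaError": 0.5,
    "measurementError": 0.1,
    "imuToCarRotation": [1, 0, 0, 0]
  }
}
```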
...