LPVIZ with Xreal Air 2 Ultra / LPPOS installation instructions
- 1 Principle of operation
- 2 LPVIZ setup
- 3 LPPOS setup (vehicle localization)
- 5 FusionHub Output
- 5 Recording and Replaying Data
- 5.1 Data Recording
- 5.2 Data Replay
- 5.2.1 Replay executable
- 6 Troubleshooting LPPOS
Principle of operation
LPVIZ setup
Overview
System components
| Xreal Air 2 Ultra | USB-C connection cable |
|---|---|
| LPMS-IG1 (for installation with ART / Optitrack system) | |
How to connect Xreal Air 2 Ultra to a PC
In order to display graphics on the Xreal Air 2 Ultra the headset needs to be connected to a PC using a USB-C connection with DisplayPort support. Sometimes this is also referred to as Thunderbolt support. The following configurations allow for this type of connection.
- Laptop graphics card with USB-C output. Many laptop graphics cards offer this in order to connect an external monitor. For best performance a built-in Nvidia graphics adapter with a corresponding USB-C connector is preferred.
- An external adapter that converts a PC's USB port and DisplayPort into a USB-C line. An example of this type of adapter: https://www.store.level1techs.com/products/p/dp-repeater-hdmi-splitter-6sha9-yznx5-zm58w
- A PCI adapter card that converts a PC's USB port and DisplayPort into a USB-C line, as shown below: https://www.asus.com/motherboards-components/motherboards/accessories/thunderboltex-4/
LPVIZ driver setup (headset display driver)
Plug Xreal Air 2 Ultra into a laptop that supports USB-C DisplayPort output.
Press the volume-down button for 2 seconds to switch the HMD to 3D screen mode.
Display setup should look like this, with resolution 3840x1080:
Find the lpviz driver folder
Point SteamVR to the driver directory using the following command (adapt paths to your setup) on any command shell:
"C:\Program Files (x86)\Steam\steamapps\common\SteamVR\bin\win64\vrpathreg.exe" adddriver C:\path_to_lpviz_driver_base_directory

Start SteamVR by running C:\Program Files (x86)\Steam\steamapps\common\SteamVR\bin\win64\vrmonitor.exe. You can also start it through Steam. If the driver starts correctly, you should see the following SteamVR output:
Note: You can ignore the Headset Notice, or simply click on Dismiss. Running the headset as a monitor doesn’t impact performance.
Optical Tracking of HMD
There are currently two different tracking systems supported. If you are using the Optitrack System, please follow these instructions. If you are using the ART Smarttrack System, please follow these instructions.
Using Optitrack for HMD tracking
Add the following block to settings.json of the LPVIZ driver, if it's not already there. Make sure to set the IP addresses according to your setup.
{
"PoseMachineConfig": {
"plugins": [ "OptiTrack" ],
"absoluteSources": {
"optitrack": {
"settings": {
"connectionType": "Multicast",
"localAddress": "127.0.0.1",
"remoteAddress": "127.0.0.1",
"serverCommandPort": 1510,
"serverDataPort": 1511
},
"type": "OptiTrack"
},
...

Set the correct ID of the marker target as it's tracked in Optitrack here:
...
"trackedObjects": [
{
"absoluteSource": {
"name": "optitrack",
"trackingId": 1
},
...

Make sure your optical target in Motive is correctly configured. The local coordinate frame of the HMD should match the SteamVR standard coordinate system shown below:
A possible marker target configuration is shown below:
The IMUs are defined in the following block:
"imuSources": {
"hmd imu": {
"settings": {
"rateHz": 1000
},
"type": "XrealAir2Ultra"
},
"noimu": {
"settings": {},
"type": "None"
},
"platform imu": {
"settings": {
"name": "ig1232000389"
},
"type": "OpenZen"
}
}

Using ART Smarttrack for HMD tracking
Add the following block to settings.json of the LPVIZ driver, if it's not already there. Make sure to set the same port that is configured in the DTrack software. Remove any optitrack entries if present.
{
"PoseMachineConfig": {
"absoluteSources": {
"art": {
"settings": { "port": 5000 },
"type": "DTrack"
}
},
...

Set the correct ID of the marker target as it's tracked in DTrack here:
...
"trackedObjects": [
  {
    "absoluteSource": {
      "name": "art",
      "trackingId": 1
    },
...

The IMUs are defined in the following block:
...
"imuSources": {
  "hmd imu": {
    "settings": { "rateHz": 1000 },
    "type": "XrealAir2Ultra"
  },
  "platform imu": {
    "settings": { "name": "IG12327002E003C" },
    "type": "OpenZen"
  }
},
...

So your settings.json should look like this now:
{
  "PoseMachineConfig": {
    "absoluteSources": {
      "art": {
        "settings": { "port": 5000 },
        "type": "DTrack"
      }
    },
    "imuSources": {
      "hmd imu": {
        "settings": { "rateHz": 1000 },
        "type": "XrealAir2Ultra"
      },
      "platform imu": {
        "settings": { "name": "IG12327002E003C" },
        "type": "OpenZen"
      }
    },
    "trackedObjects": [
      {
        "absoluteSource": {
          "name": "art",
          "trackingId": 1
        },
...
Camera Attachment
Make sure the Smarttrack is attached to the car parallel to the car frame. A possible arrangement is shown below.
Calibration of Optical Target
The sensor fusion in the LPVIZ driver merges optical tracking data with inertial data from the headset and platform IMUs. For this calculation to work correctly, all data sources must operate in the same coordinate frame. To ensure this, we perform two calibration steps:
- Calibration of headset IMU and optical tracking
- Calibration of platform IMU (the IMU fixed to the tracking camera) and optical tracking
Under normal circumstances you need to perform this calibration procedure only once per setup.
Headset IMU vs. Optical Tracking
As a first step, make sure the optical tracking body is set up in DTrack. The body doesn't have to be in any specific alignment; just make sure to select Set origin to COG to set the origin to the center of the optical target. The following calibration adjusts the orientation alignment between the HMD target and the HMD IMU. See the example below:
Note: We assume that Xreal places the IMU coordinate system in correct alignment with the visual coordinate system of the glasses. Therefore we use the IMU orientation as reference and transform the optical input data to match the IMU's frame.
To calibrate the HMD IMU against the optical tracking body, we can use the calibration tool on the configuration page. Make sure the HMD is connected and that SteamVR is running and has detected the HMD. Then open http://localhost:7119/ in your browser; you should see the following.
The page mentions LPVR-CAD but works for LPVIZ as well. You can also verify the status of the tracking system and the HMD IMU here.
In the Additional Features section in expert mode, select your optical tracking source (usually art), the tracking ID of the HMD marker target in DTrack (usually 1) and the IMU source (usually hmd imu), as shown below.
Start the calibration by clicking on Find Rotation. Rotate the HMD slowly in front of the camera.
The pose matching should continuously progress.
After 50 poses have been recorded the calibration will output the rotation between IMU and optical tracking as quaternion.
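Conceptually, this step estimates the constant rotation offset between the IMU frame and the optical target frame from pairs of simultaneously recorded orientations. The following pure-Python sketch illustrates the idea (illustrative only, not the actual driver implementation; it assumes noise-free pairs related by q_optical = q_offset * q_imu):

```python
import math

def qmul(a, b):
    """Hamilton product of two quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def qconj(q):
    """Conjugate; equals the inverse for unit quaternions."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def estimate_offset(pairs):
    """Estimate the fixed rotation q_off assuming q_optical = q_off * q_imu.
    Per-sample offsets are averaged; signs are aligned first because
    q and -q represent the same rotation."""
    acc = [0.0, 0.0, 0.0, 0.0]
    ref = None
    for q_opt, q_imu in pairs:
        off = qmul(q_opt, qconj(q_imu))
        if ref is None:
            ref = off
        if sum(a * b for a, b in zip(off, ref)) < 0:
            off = tuple(-c for c in off)
        acc = [a + b for a, b in zip(acc, off)]
    norm = math.sqrt(sum(c * c for c in acc))
    return tuple(c / norm for c in acc)
```

The resulting quaternion plays the role of the value the calibration tool reports.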
Paste this quaternion into the configuration as absoluteFromImuFrameQuat:

{
  "PoseMachineConfig": {
    "absoluteSources": {
      "art": {
        "settings": { "port": 5000 },
        "type": "DTrack"
      }
    },
    "imuSources": {
      "hmd imu": {
        "settings": { "rateHz": 1000 },
        "type": "XrealAir2Ultra"
      },
      "platform imu": {
        "settings": { "name": "IG12327002E003C" },
        "type": "OpenZen"
      }
    },
    "trackedObjects": [
      {
        "absoluteSource": {
          "name": "art",
          "trackingId": 1
        },
        "combinerType": "DifferentialImu",
        "emitterName": "HMD",
        "enabled": true,
        "imuSource": "hmd imu",
        "settings": {
          "absoluteFromImuFrameQuat": {
            "w": 0.9952856708607908,
            "x": -0.09611135784177455,
            "y": -0.009899980062638399,
            "z": -0.008427969373189247
          },
          "opticalWeight": 0.001,
...
Set opticalWeight to 0.001.
Intended result:
When you look through the glasses now, running only the SteamVR default scene with the coordinate frame, you should see that the origin of the SteamVR frame is in the center between the Smarttrack tracking cameras. The frame should point forward, towards the camera. The horizon should be parallel to the camera plane and to the ground. Adjust the mount that holds the Smarttrack so that the horizon is correctly aligned with the (real) ground.
Rotations and translations should be rendered smoothly in VR. When turning the head sideways and stopping, the horizon should not drift, or only show a very minimal drift.
We run this calibration to make the system perform correctly under static conditions, i.e. when the car isn't moving. This is a prerequisite for adding data from the platform IMU to make in-car tracking work when the vehicle is in motion.
Platform IMU vs. Optical Tracking
In this step we calibrate the alignment of the platform IMU on the Smarttrack 3. For this purpose we perform a calibration similar to the one above, but inverted: we keep the HMD at rest and rotate the cameras around it.
To do this, go once again to the calibration menu and adjust it in the following way. Note that inverse calibration is checked.
As before, click on Find Rotation to start the calibration process. In the video below we show how this is done.
After recording 50 poses, the resulting quaternion will be shown. Paste it into the configuration as referenceOrientationQuat:
{
"PoseMachineConfig": {
"absoluteSources": {
"art": {
"settings": { "port": 5000 },
"type": "DTrack"
}
},
"imuSources": {
"hmd imu": {
"settings": { "rateHz": 1000 },
"type": "XrealAir2Ultra"
},
"platform imu": {
"settings": { "name": "IG12327002E003C" },
"type": "OpenZen"
}
},
"trackedObjects": [
{
"absoluteSource": {
"name": "art",
"trackingId": 1
},
"combinerType": "DifferentialImu",
"emitterName": "HMD",
"enabled": true,
"imuSource": "hmd imu",
"settings": {
"absoluteFromImuFrameQuat": {
"w": 0.9952856708607908,
"x": -0.09611135784177455,
"y": -0.009899980062638399,
"z": -0.008427969373189247
},
"opticalWeight": 0.001,
"referenceImu": "platform imu",
"referenceOrientationQuat": {
"w": 0.513001807618862,
"x": -0.48739942766068073,
"y": 0.4896693148934658,
"z": 0.509406424525236
},
"referenceToOpticalQuat": {
"w": 1,
"x": 0,
"y": 0,
"z": 0
}
}
}
]
}
}

Note that we don't change referenceToOpticalQuat; it keeps its default value, the identity quaternion.
Intended result:
Regardless of whether the car is moving or static, tracking should work correctly. When the car drives through a curve, focus on the location of the origin of the SteamVR coordinate frame. As in the static case, the origin should stay fixed in the center between the Smarttrack cameras.
When going up and down a slope, the horizon should stay locked to the car and not change its angle relative to the car.
LPPOS setup (vehicle localization)
Overview
System components
| LPPOS box | OBD to CAN bus cable |
|---|---|
| GNSS antenna | OBD-II to USB cable |
| LPPOS box to host cable | |
LPPOS software (FusionHub)
The FusionHub application is FusionHub.exe. Start it by running fusionhub_control.exe. FusionHub needs a config.json file in the directory it is run from. The complete documentation for the LPPOS software is here: LPVR-POS Manual
GNSS-IMU filter configuration
"gnssImuFusion": {
"publishIntervalMs": 10,
"dataEndpoint": "inproc://fusion_data",
"fuser": {
"_globalOrigin": {
"latitude": 35.657632,
"longitude": 139.73226516666668
},
"accelError": 0.01,
"debugFilter": false,
"fitModel": "ModelGnssImu",
"imuQueueDelayMs": 20,
"imuToCarRotation": {
"w": 1,
"x": 0,
"y": 0,
"z": 0
},
"initializeFromGnssOrientation": true,
"measurementError": 0.05,
"omegaError": 0.1,
"orientationFromGnssError": 0.1,
"outputRawGnssData": false,
"outputWhenFilterNotReady": false,
"requireRtkFix": true,
"retainStateWhenStopped": false,
"setHeightToZero": true,
"smoothFit": true,
"useGnssOrientationMeasurement": true
},
"inputEndpoints": [
"inproc://vehicle_data",
"inproc://gnss_data",
"inproc://imu_data"
]
}

By default the coordinate origin will be the location where you start the application. This location will be (0, 0) of the planar ENU coordinates sent to Unreal Engine. You can set the origin to a specific location by defining globalOrigin:
"globalOrigin": {
"latitude": 35.657632,
"longitude": 139.73226516666668
}

Full Vehicle Fusion Filter Configuration
FusionHub introduces a new Full Vehicle Fusion mode in which two estimators run in parallel: a GNSS-IMU fusion (the standard configuration) and an odometry-GNSS-IMU fusion.
The odometry-GNSS-IMU fusion operates in 2D and continuously generates a virtual GNSS signal derived from vehicle odometry and IMU data. This virtual GNSS output is then fed into the GNSS-IMU fusion pipeline.
The key advantage of this architecture is improved robustness during GNSS outages. In scenarios such as tunnel driving or urban canyons, where real GNSS measurements may be unavailable or degraded, the system continues to produce a stable, full 3D vehicle pose by relying on the virtual GNSS signal computed from odometry and IMU inputs.
Note that this approach requires access to low-latency vehicle odometry data. If such data is not available or exhibits excessive latency, this feature cannot be used.
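The idea of a virtual GNSS signal can be illustrated with a minimal 2D dead-reckoning sketch (illustrative only; the actual FusionHub filter is a proper state estimator, not this simple integrator). A planar pose is propagated from wheel speed and IMU yaw rate, and the integrated position can then be treated like a GNSS fix:

```python
import math

def dead_reckon_step(x, y, heading, speed_mps, yaw_rate_rps, dt):
    """Propagate a planar pose one time step from odometry speed and
    IMU yaw rate; the result can act as a virtual GNSS position."""
    heading += yaw_rate_rps * dt
    x += speed_mps * math.cos(heading) * dt
    y += speed_mps * math.sin(heading) * dt
    return x, y, heading

# Example: drive straight at 10 m/s for one second in 10 ms steps
x = y = heading = 0.0
for _ in range(100):
    x, y, heading = dead_reckon_step(x, y, heading, 10.0, 0.0, 0.01)
```

Because pure integration drifts over time, the real system keeps correcting this estimate with actual GNSS measurements whenever they are available.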
The filter is configured in the following way:
"fullVehicleFusion": {
"dataEndpoint": "inproc://fusion_data",
"publishIntervalMs": 10,
"settings": {
"useDirectGnss": false,
"odometryImuFusion": {
"driveModel": "Differential",
"fitModel": "SimpleCarModel",
"imuTransform": {
"w": 1,
"x": 0,
"y": 0,
"z": 0
},
"imuTurnRateAxis": {
"x": 0,
"y": 0,
"z": 1
},
"initializeFromGnssOrientation": true,
"measurementError": 0.05,
"omegaError": 0.01,
"smoothFit": true,
"trackWidthM": 1.56,
"transformGnssOrientation": false,
"useImuTurnRate": true,
"utmZone": "31T",
"velError": 0.277777778,
"wheelBaseM": 2.597,
"useGpsOnRtkFloat": false,
"gnssFixTransitionIgnoreSamples": 3
},
"gnssImuFusion": {
"debugFilter": false,
"enableStatePersistence": false,
"fitModel": "ModelGnssImu",
"globalOrigin": {
"latitude": 50.148147,
"longitude": 8.824422
},
"imuQueueDelayMs": 0,
"imuToCarRotation": {
"w": 1,
"x": 0,
"y": 0,
"z": 0
},
"initializeFromGnssOrientation": true,
"orientationFromGnssError": 0.001,
"accelError": 0.01,
"omegaError": 0.005,
"measurementError": 0.001,
"outputRawGnssData": false,
"outputWhenFilterNotReady": false,
"requireRtkFix": true,
"retainStateWhenStopped": true,
"smoothFit": true,
"transformGnssOrientation": true,
"useGnssOrientationMeasurement": true
}
},
"inputEndpoints": [
"inproc://vehicle_data",
"inproc://gnss_data",
"inproc://imu_data"
]
}

GPS module
Connect the LPPOS box to your computer.
Check which COM port the GPS module in the box is using by looking at the Windows device manager. It is one of the Silicon Labs virtual COM ports that show up. At the moment we don't have a method to determine which one specifically; use the UPrecise application below to check which one works. Usually the COM port assignment doesn't change when using the box on the same computer.
Download the UPrecise app from here: https://docs.holybro.com/gps-and-rtk-system/h-rtk-unicore-um982/download
Connect to the GPS unit using the UPrecise app. If you are outside and the antennas are connected, you should see the box find a few GPS satellites. Check that data is being received, specifically GGA, GSA, VTG and HDT messages.
If everything seems fine, disconnect the UPrecise application.
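If you prefer to script this check, every NMEA sentence ends in an XOR checksum over the characters between $ and *. A small Python sketch to validate incoming sentences (reading from the COM port itself, e.g. via pyserial, is omitted here):

```python
def nmea_checksum(payload: str) -> str:
    """XOR of all characters between '$' and '*', as two uppercase hex digits."""
    csum = 0
    for ch in payload:
        csum ^= ord(ch)
    return f"{csum:02X}"

def is_valid_sentence(sentence: str) -> bool:
    """Validate a full NMEA sentence such as '$GPHDT,360.0,T*30'."""
    if not sentence.startswith("$") or "*" not in sentence:
        return False
    payload, expected = sentence[1:].rsplit("*", 1)
    return nmea_checksum(payload) == expected.strip().upper()
```

Sentences with a failing checksum indicate a wrong baud rate or a flaky connection.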
Adjust the COM port in the corresponding block in the FusionHub configuration:
"gnss": {
"dataEndpoint": "inproc://gnss_data",
"inputEndpoints": [
"inproc://rtcm_data"
],
"settings": {
"baudrate": 460800,
"port": "COM9",
"rtcm": true,
"dualGPS": true
},
"type": "NMEA"
},

CAN bus interface (odometry data, option 1)
In case you're reading CAN bus data directly from your vehicle bus, download the PCAN-View app to check that your CAN bus configuration is correct: https://www.peak-system.com/PCAN-View.242.0.html?L=1
Connect your vehicle CAN bus to the LPPOS box CAN connector.
In PCAN view set the correct baud rate and check that data is coming in. Confirm that the message transporting vehicle speed is arriving.
Configure the vehicle node in FusionHub so that it parses the vehicle speed correctly. For example the vehicle speed CAN message specification for a Tesla Model 3 is:
| CAN ID | 0x257 |
|---|---|
| Bus | 500 kbps |
| Endianness | Little |
| Bit start | 12 |
| Length | 12 bits |
| Factor | 0.08 kph per bit |
| Offset | −40.0 kph |
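One way to read this table in code is shown below. Bit-numbering conventions for little-endian CAN signals vary between DBC tools, so treat this as a sketch and verify it against live data:

```python
def decode_vehicle_speed(frame_data: bytes) -> float:
    """Decode the speed in kph from an 8-byte CAN payload (ID 0x257):
    12 bits starting at bit 12, little-endian, scale 0.08 kph/bit,
    offset -40.0 kph (one plausible reading of the table above)."""
    raw = (int.from_bytes(frame_data, "little") >> 12) & 0xFFF
    return raw * 0.08 - 40.0

# Example: a raw counter of 1125 encodes 1125 * 0.08 - 40 = 50 kph
frame = (1125 << 12).to_bytes(8, "little")
```

An all-zero payload decodes to the offset value of −40.0 kph, which is a quick sanity check for the bit extraction.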
The resulting configuration block for FusionHub looks like this:
"vehicle": {
"dataEndpoint": "inproc://vehicle_data",
"publishIntervalMs": 100,
"settings": {
"baudrate": 500000,
"canInterface": "PeakCAN",
"canProtocol": {
"velocityCanId": 599,
"velocityStartBit": 12,
"velocityBitsLength": 12,
"velocityScale": 0.08,
"velocityOffset": -40.0,
"endianness": "little"
},
"trackWidth": 1.58,
"vehicleType": "Minimal",
"wheelBase": 2.746
},
"type": "Automotive"
},

OBD-II interface (odometry data, option 2)
If you don't have the option to read data directly from the CAN bus of your vehicle, vehicle speed can also be read via the OBD-II diagnostics port. Use an OBD-II to USB interface with an ELM327 chip.
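FusionHub handles the ELM327 protocol internally. For reference, the underlying exchange is a standard OBD-II request for PID 0x0D (vehicle speed); a reply can be parsed as sketched below (serial I/O, e.g. via pyserial, is omitted):

```python
def parse_speed_reply(line: str):
    """Parse an ELM327 reply to the OBD-II request '01 0D' (vehicle speed).
    A positive reply looks like '41 0D 3C'; the last byte is km/h."""
    parts = line.strip().split()
    if len(parts) >= 3 and parts[0] == "41" and parts[1] == "0D":
        return int(parts[2], 16)
    return None  # e.g. 'NO DATA' or 'SEARCHING...'
```

Note that the OBD-II path typically has higher latency than reading the CAN bus directly, which is why the direct CAN interface is option 1.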
"elm327": {
"dataEndpoint": "inproc://vehicle_data",
"settings": {
"baudrate": 115200,
"isPorsche": false,
"port": "COM7"
}
}

RTK base station service
Set up the RTK GPS base station service in FusionHub.
A high-density caster network in California: https://sopac-csrc.ucsd.edu/index.php/crtn/. The closest caster to the TRI office is located at SLAC, configured as follows:
"RTCM": {
"dataEndpoint": "inproc://rtcm_data",
"settings": {
"forwardGnss": true,
"host": "132.239.152.4",
"gnssEndpoint": "inproc://gnss_data",
"initialLatitude": 59.457424,
"initialLongitude": 15.302399,
"mountpoint": "SLAC_RTCM3",
"password": "XIKAKUSURV",
"port": "2103",
"user": "CRTNXIKAKU",
"userAgent": ""
},
"type": "NTRIP"
}

Alternatively, here is the RTK2GO free caster list: http://monitor.use-snip.com/?hostUrl=rtk2go.com&port=2101
"RTCM": { // California Rtk2Go
"dataEndpoint": "inproc://rtcm_data",
"settings": {
"forwardGnss": true,
"gnssEndpoint": "inproc://gnss_data",
"host": "3.143.243.81",
"initialLatitude": 33.88,
"initialLongitude": -117.92,
"mountpoint": "HCPV",
"password": "none",
"port": "2101",
"user": "klaus@lp-research.com",
"userAgent": "LPVR"
},
"type": "NTRIP"
},

Free caster setting in San Jose:
"RTCM": { // California Rtk2Go
"dataEndpoint": "inproc://rtcm_data",
"settings": {
"forwardGnss": true,
"gnssEndpoint": "inproc://gnss_data",
"host": "3.143.243.81",
"initialLatitude": 33.88,
"initialLongitude": -117.92,
"mountpoint": "CA_SanJose_ML_X5",
"password": "none",
"port": "2101",
"user": "klaus@lp-research.com",
"userAgent": "LPVR"
},
"type": "NTRIP"
},

FusionHub Output
Sending FusionHub Data to External Applications via the ZeroMQ Interface
FusionHub emits the data resulting from sensor fusion over the local network (TCP/IP) as Protobuf-encoded ZeroMQ messages. The same mechanism is used internally by FusionHub to connect its various nodes. Each output node can be configured to stream data not only internally but also externally.
Output Ports
The network port that this information is output on can be configured in the JSON parameter file config.json of FusionHub. Usually each FusionHub node contains a parameter such as "endpoint": "tcp://*:8899" that allows setting the output port. The output of the node can then be accessed by connecting to the IP address of the computer that FusionHub is running on, at the defined port.
Data Format
As the low-level protocol for emitting the output data we use ZeroMQ (publisher / subscriber). The data itself is encoded as Protocol Buffers. Protocol Buffers are documented here. Messages are defined in the Protobuf (.proto) format in the file stream_data.proto, which is contained in the installation folder of FusionHub.
Python Resources
Download a Python example that shows how to decode messages from FusionHub from this repository.
Prerequisites can be installed in your Python 3 environment with this:
pip install zmq
pip install protobuf

Make sure to set the input port in FusionHubPythonExample.py correctly. For example, for the vehicularFusion source definition below, the port needs to be set to 8899.
"vehicularFusion": {
"dataEndpoint": "tcp://*:8899",
"fuser": {
"driveModel": "Differential",
"fitModel": "SimpleCarModel",
"gnssFixTransitionIgnoreSamples": 3,
"imuTransform": {
"w": 1,
"x": 0,
"y": 0,
"z": 0
},
"imuTurnRateAxis": {
"x": 0,
"y": 0,
"z": 1
},
"initializeFromGnssOrientation": true,
"measurementError": 0.05,
"omegaError": 0.01,
"smoothFit": true,
"trackWidthM": 1.56,
"transformGnssOrientation": false,
"useImuTurnRate": true,
"utmZone": "31T",
"velError": 0.277777778,
"wheelBaseM": 2.597,
"useGpsOnRtkFloat": false
},
"inputEndpoints": [
"tcp://localhost:9921",
"inproc://vehicle_data",
"inproc://gnss_data",
"inproc://imu_data"
  ]
}

C# Resources
On parsing Protobuf files: https://github.com/5argon/protobuf-unity
How to subscribe to ZeroMQ messages: https://github.com/gench23/unity-zeromq-client and https://tech.uqido.com/2020/09/29/zeromq-in-unity/
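For reference, the subscribe-and-decode flow described in this section can be sketched in Python (assuming pyzmq is installed via the pip commands above; the Protobuf decoding step, which needs the protoc-generated classes from stream_data.proto, is only indicated as a comment):

```python
import zmq

def open_subscriber(endpoint):
    """Connect a ZeroMQ SUB socket to a FusionHub output endpoint,
    e.g. tcp://<fusionhub-host>:8899 (the port set in config.json)."""
    ctx = zmq.Context.instance()
    sock = ctx.socket(zmq.SUB)
    sock.connect(endpoint)
    sock.setsockopt(zmq.SUBSCRIBE, b"")  # no topic filtering
    return sock

def receive_messages(sock, count=1, timeout_ms=5000):
    """Receive raw frames; each frame is one Protobuf-encoded message
    as defined in stream_data.proto."""
    sock.setsockopt(zmq.RCVTIMEO, timeout_ms)
    frames = [sock.recv() for _ in range(count)]
    # Decode each frame with the classes generated by protoc from
    # stream_data.proto (module name depends on your protoc run).
    return frames
```

Against a running FusionHub you would call, for instance, `sub = open_subscriber("tcp://localhost:8899")` and then `receive_messages(sub, count=10)`; the repository's FusionHubPythonExample.py remains the authoritative reference.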
VRPN Output (Unity)
VRPN output is set in the following part of the sinks section of config.json. The device name will be referenced by the plugin for the Unreal Engine.
"VRPN": {
"settings": {
"deviceName": "Fusion Hub"
}
}

Please see below how we achieve data input via VRPN in the Unreal Engine. First, install the VRPN LiveLink plugin:
Configure the VRPN source with the correct device and subject name:
Apply the output from FusionHub to an Unreal object, e.g. a cine camera actor.
Serial Interface
By adding the following to the configuration file, the fused pose is passed to a serial port that can be configured (COM1 in this example).
"nmeaOutput": {
"inputDataFilter": [
"FusedVehiclePoseV2"
],
"type": "serial",
"port": "COM1",
"baudrate": 115200
},

The output is in the NMEA format and can be parsed by all NMEA-capable viewers and devices. Some message types (e.g. HDT) are not supported by all viewers, but these should be excluded automatically.
$GPRMC,102019.47,A,3539.4509,N,13943.9400,E,0.0,0.0,180226,,,*1D
$GPVTG,0.0,T,,M,0.0,N,0.0,K*60
$GPGGA,102019.47,3539.4509,N,13943.9400,E,1,10,1.0,0.0,M,0.0,M,0.0,0000*7F
$GPHDT,360.0,T*30
$GPRMC,102019.48,A,3539.4509,N,13943.9400,E,0.0,0.0,180226,,,*12
$GPVTG,0.0,T,,M,0.0,N,0.0,K*60
$GPGGA,102019.48,3539.4509,N,13943.9400,E,1,10,1.0,0.0,M,0.0,M,0.0,0000*70
$GPHDT,360.0,T*30

Example output file:
ROS2 Output
To run FusionHub with ROS2 support you need a special version of FusionHub that’s been built with the ROS2 libraries.
Simple ROS2 output
In order to enable ROS2 output for LPPOS / FusionHub, FusionHub needs to be run from a ROS2 shell. Set up ROS2 and Fast RTPS for best performance.
FusionHub has a special ROS2 output node to convert FusionHub protocol messages to the ROS2 network:
"ros2Publisher": {
"frameId": "oxts_link",
"imuTopic": "/imu/data",
"inputEndpoints": [
"inproc://fusion_data"
],
"publishTf": true,
"tfChildFrame": "oxts_link",
"tfParentFrame": "map",
"tfRotationOffset": {
"w": 0,
"x": 1,
"y": 0,
"z": 0
}
}

If the GNSS-IMU fusion is outputting FusedVehiclePoseV2 and is connected to ros2Publisher, the node will output transform (tf) messages configured as shown above. More documentation about tf messages is available here: https://ros2-industrial-workshop.readthedocs.io/en/latest/_source/navigation/ROS2-TF2.html#
Note: FusedVehiclePoseV2 only carries yaw orientation changes. This can make it easier to test system performance when driving on a flat, horizontal surface such as a parking lot. In uneven or not completely horizontal terrain, virtual objects displayed outside the car won't be rendered correctly. To output roll, pitch and yaw to Unreal Engine, use FusedPose as shown below.
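To illustrate what yaw-only means, here is how the heading angle can be extracted from a full orientation quaternion (a sketch; sign and axis conventions depend on your coordinate frames):

```python
import math

def yaw_from_quaternion(w, x, y, z):
    """Extract the yaw (heading) angle in radians from a unit quaternion,
    assuming a Z-up frame and the standard ZYX Euler convention."""
    return math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))

# Example: a 90 degree rotation about the vertical axis
q = (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
```

A yaw-only pose keeps exactly this angle and discards roll and pitch, which is why slopes are not represented correctly with FusedVehiclePoseV2.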
Full orientation output (roll, pitch, yaw)
To output the full orientation, first test with the configuration above to confirm that everything works.
Besides FusedVehiclePoseV2, the GnssImuFusion node outputs the FusedPose data structure, which contains the x, y, z position of the car as well as the full 3D orientation. To route FusedPose to Unreal Engine, use the following block:
"ros2Publisher": {
"inputDataFilter": [
"FusedPose"
],
"frameId": "oxts_link",
"imuTopic": "/imu/data",
"inputEndpoints": [
"inproc://fusion_data"
],
"publishTf": true,
"tfChildFrame": "oxts_link",
"tfParentFrame": "map",
"tfRotationOffset": {
"w": 0.5,
"x": 0,
"y": 0,
"z": 0.5
},
"invertYaw": true
}

Recording and Replaying Data
Data Recording
You can record the output of FusionHub to a file by adding the following lines to the sinks section of config.json. All JSON output printed to the screen during FusionHub operation will be written to the log file.
"logger": {
"inputDataFilter": [
"FusedVehiclePoseV2"
],
"inputEndpoints": [
"inproc://prediction_output",
"inproc://fusion_data_output"
],
"settings": {
"filename": "FileToLogTo.json"
}
}

| Parameter name | Description | Default |
|---|---|---|
| filename | Filename of the file to be recorded. | driveData.json |
| format | The file format. At the moment only JSON is supported. | json |
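Assuming the logger writes one JSON object per line (an assumption; check your actual log output), a recorded file can be inspected with a few lines of Python:

```python
import json

def read_log(path, type_key=None):
    """Read a FusionHub JSON log file line by line; optionally keep only
    records containing the given top-level key, e.g. 'FusedVehiclePoseV2'
    (the key name mirrors the inputDataFilter setting above)."""
    records = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            rec = json.loads(line)
            if type_key is None or type_key in rec:
                records.append(rec)
    return records
```

This is handy for a quick plausibility check of a drive recording before feeding it to the replay executable described below.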
Data Replay
Replay executable
In order to replay data from a JSON file, a separate application, ReplayExecutable.exe, delivered with LPVR-POS, is used. The replay executable reads the JSON data from the given file, pushes it to a replay queue and sends it to the network (tcp://localhost:9921 by default). To run the ReplayExecutable use the following command:
ReplayExecutable.exe -r <path/to/file.json> [--replay-speed 1] [--queue-size 100] [--echo-data] [--verbose]

| Key | Description | Type | Example value |
|---|---|---|---|
| -r | Path of the file to read | String | "log.json" |
| --replay-speed | Replay speed relative to the actual recording | Double | 1 |
| --queue-size | Queue size at which the file reader stops pushing new data to the replay queue. Increase this value if you see lots of data published at the same time. | Integer | 100 |
| --echo-data | Listen to the publishing endpoint and display the replayed data | N/A | N/A |
| --verbose | Print debugging information, i.e. the timestamps at which a packet is added to, replayed from, and discarded from the replay queue. | N/A | N/A |
FusionHub can receive data from the replay application by adding the replay application's endpoint as a source in the configuration file:
{
...,
"sources": {
"endpoints": ["tcp://localhost:9921"]
}
}

The replay executable replays the recorded data so that it can be used as input to the fusion nodes. Keep the same "sinks" block that was used during live testing; the "sources" block should be replaced by just the replay endpoint as shown above, instead of the individual hardware data sources.
Troubleshooting LPPOS
Checking if data is arriving correctly from all components
For the LPPOS fusion to run correctly the following data is required:
VehicleSpeed data
IMU data
RTCM (RTK correction) data
GNSS data
After connecting the GUI you should be able to see the data counters for these messages increasing. The counters are located on the dashboard page of the UI:
Once the car is outside and RTCM correction data is received correctly, GNSS Quality should switch to RTK-fixed after about a minute. This is required for good tracking quality. The system can also operate with normal GPS, but will then be less accurate.
Another way to check that these data streams are flowing correctly is to output them to the console.
In the config file there’s the following section:
"echo": {
"inputDataFilter": [
"VehicleSpeed"
],
"inputEndpoints": [
"inproc://gnss_data",
"inproc://vehicle_data",
"inproc://rtcm_data",
"inproc://imu_data"
]
},

Make sure that this section isn't commented out with a _ prefix before echo.
Set the inputDataFilter to the data type you’d like to monitor:
- VehicleSpeed
- ImuData
- GnssData
- RTCMData
Then start the FusionHub. The data you set the filter to should now be output to the console.
Checking localization output
You can see the output of the LPPOS fusion in the map view of the FusionHub UI and, once ROS2 communication with the Unreal Engine is established, also in the Unreal Engine project. This should look similar to the video below: the map view is on the left side, the Unreal Engine output on the right.
ROS2 communication
Make sure you start FusionHub from a ROS2-enabled shell, i.e. call local_setup.bat before starting FusionHub. The script start_lppos_ros2.bat does this automatically.
The ROS2 output section in the FusionHub config looks like this:
"ros2Publisher": {
"inputDataFilter": [
"FusedPose"
],
"_inputDataFilter": [
"FusedVehiclePoseV2"
],
"frameId": "oxts_link",
"imuTopic": "/imu/data",
"inputEndpoints": [
"inproc://fusion_data"
],
"publishTf": true,
"tfChildFrame": "oxts_link",
"tfParentFrame": "map",
"tfRotationOffset": {
"w": 0.5,
"x": 0,
"y": 0,
"z": 0.5
},
"_tfRotationOffset_for_FusedVehiclePoseV2": {
"w": 1,
"x": 0,
"y": 0,
"z": 0
},
"invertYaw": true
}

While FusionHub is running and localization information is being calculated, the ROS2 node will output /tf frames. By default these frames have the child frame name oxts_link and the parent frame map. They can be monitored using ros2 topic echo /tf.
Check if your COM ports are correct
Some sensors, like the GNSS receiver and the OBD interface, show up as COM ports on your computer. The COM port numbers in your system and in the config file need to match. The assigned numbers depend on your system, but generally stay the same unless more components are added. So you should only need to set the correct COM ports in the config file once.
To find the COM port numbers, plug in all sensors and open the Windows device manager by pressing the Windows key and typing "device".
In the device manager, find “Ports (COM & LPT)” and open it. There you will find the assigned COM ports.
If you are unsure which COM port belongs to which device, plug one out and check which COM port is removed from the device manager.
Common Log File Errors/Warnings
Couldn’t connect to device xxx
See the example in the picture below (highlighted message on the left) and the related symptom on the FusionHub GUI dashboard: no data is coming in from the IMU (zero messages from the IMU on the right).
The startup log actually tells you the device ID of the IMU that was found (see the highlighted part of the left screenshot). Put that ID into the FusionHub config file (see the right screenshot).