...
FusionHub BASE combines data from an outside-in tracking system with inertial measurements from an IMU. Typical applications: Head-mounted display tracking for VR/AR, camera tracking for virtual production
FusionHub MOVE adds an additional platform IMU to the BASE configuration. It combines the data from both IMUs to calculate poses relative to a moving platform. Typical applications: AR/VR in a vehicle, aircraft, or on a simulator platform
FusionHub FLOW combines odometry, GPS and IMU data from a vehicle to calculate high-accuracy and low-latency global localization information. Typical applications: Automobile localization, robot localization
The diagram below shows the general structure of FusionHub. Sources and sinks are connected by a filter unit. The sensor fusion functionality is contained in this filter unit. The filter parameters as well as the parameters of input and output blocks can be configured via a configuration script or the graphical user interface.
The graphical user interface is detached from the main FusionHub application and both applications can therefore run on separate computers. This provides flexibility for running FusionHub on devices with limited monitoring capabilities like a head mounted display.
...
General
Running FusionHub
FusionHub consists of two components:
The main application
A graphical user interface application
...
The main FusionHub application is started by running FusionHub.exe
. No specific installation is needed; the application can be run directly out of its deployment directory. It is a command line application that uses the file config.json
for its configuration. We will explain the contents and options of the configuration file further below.
Please install the graphical user interface by running lp-fusionhub-dashboard_0.1.0_x64_en-US.msi
. It installs lp-fusionhub-dashboard
in your Start menu; launch the application from there. Press the Connect
button after starting FusionHub.exe
to connect the client and server. If you are running FusionHub on a separate machine, make sure to enter the correct IP address.
Communication with External Applications
Sending FusionHub Data to External Applications via the ZeroMQ Interface
FusionHub emits the result data of the sensor fusion through the local network interface.
...
The network port that this information is output to can be configured in the JSON parameter file config.json
of FusionHub.
Data Format
As the low-level protocol to emit the output data we use ZeroMQ (publisher / subscriber). The data itself is encoded as Protocol Buffers. Protocol Buffers are documented here. Messages are defined in the Protobuf (.proto) format in the file stream_data.proto
. This file is contained in the installation folder of FusionHub.
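As a minimal sketch of how a client can consume this stream, the following Python snippet subscribes to the FusionHub publisher socket with pyzmq and receives the raw Protocol Buffers messages. It assumes FusionHub publishes on port 8899 on the local machine (as in the Antilatency example further below) and that stream_data.proto has been compiled with protoc into a stream_data_pb2 module; the message type name in the comments is hypothetical and should be replaced by the one defined in stream_data.proto.

```python
# Minimal ZeroMQ subscriber sketch for FusionHub output.
# Assumptions: FusionHub publishes on tcp://localhost:8899 and
# stream_data.proto has been compiled with
#   protoc --python_out=. stream_data.proto
# The message type name below is hypothetical; use the one defined
# in your stream_data.proto.
import zmq
# import stream_data_pb2  # generated from stream_data.proto

context = zmq.Context()
socket = context.socket(zmq.SUB)
socket.connect("tcp://localhost:8899")
socket.setsockopt_string(zmq.SUBSCRIBE, "")  # subscribe to all messages

while True:
    raw = socket.recv()  # one Protobuf-encoded message
    print(f"received {len(raw)} bytes")
    # message = stream_data_pb2.StreamData()  # hypothetical message type
    # message.ParseFromString(raw)
    # print(message)
```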
...
Download a Python example that shows how to decode messages from FusionHub from this repository.
Prerequisites can be installed in your Python 3 environment with this:
...
Make sure to set the input port in FusionHubPythonExample.py correctly. For example, for the Antilatency source definition below, the port needs to be set to 8899
.
```
"optical": {
    "type": "Antilatency",
    "settings": {
        // Use this for access from an external process eg. ALVR
        "endpoint": "tcp://*:8899",
        // Use this for internal access eg. sensor fusion
        //"endpoint": "inproc://optical_data_source_1",
        "environmentLink": "AntilatencyAltEnvironmentHorizontalGrid~AgAEBLhTiT_cRqA-r45jvZqZmT4AAAAAAAAAAACamRk_AQQCAwICAgICAQICAAI",
        "placementLink": "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA"
    }
},
```
C# Resources
On parsing Protobuf files: https://github.com/5argon/protobuf-unity
...
```
"VRPN": {
    "settings": {
        "deviceName": "Fusion Hub"
    }
}
```
Please see below how we achieve data input via VRPN in the Unreal Engine. First, install the VRPN LiveLink plugin:
...