Application Setup
1. Download and install LpmsControl 2 from here. Open the FusionHub ZIP file and copy the contents to a local folder.
2. Edit config.json and adjust parameters as needed. The default config.json looks like below. Items to adjust in config.json:
   - Optical tracking system definition (DTrack, Optitrack)
   - License key (provided by LP)
   - Intercalibration result (see below)
   Have a look at the comments in config.json to see what goes where.
3. To start FusionHub, execute StartFusionHub.bat. Alternatively, you can start FusionHub manually from an elevated (administrator) command line with:

   ./websocketd.exe -staticdir SurviveViewer -port 8080 FusionHub.exe -c config.json

   The command line output will look like this:
   When started, the FusionHub batch file opens two browser windows pointing to http://localhost:8080/ and index.html. These windows contain a 3D representation of the acquired data and the parameter settings / status output.
- Sensor fusion parameters (editable): These parameters control the functionality of the sensor fusion. Usually they do not need to be changed. We will add a more detailed description of these parameters later on.
- Autocalibration status (read-only): Shows the status of the license check (true -> check passed), the autocalibration status (true -> autocalibration finished), and the number of IMU / optical samples received.
- Intercalibration status (read-only): Shows the current status of the intercalibration. Once the intercalibration is finished, finished becomes true. nPoses needs to reach 40 for the intercalibration quaternion to be calculated. The result will be displayed in quat. Apply the values in quat to the intercalibration quaternion in config.json.
4. Check that the application is receiving IMU data: nImu should be steadily increasing. If not, check the IMU settings in config.json and make sure data can be received by running LpmsControl 2.
5. Check that the application is receiving optical data: nOptical should be steadily increasing.
Intercalibration
The intercalibration calculates the orientation offset between the optical tracking body and the IMU attached to it. This calibration is needed for drift-free fusion of these two data sources.
1. Place the sensor, including the optical target, in a motionless state. Wait for the autocalibration to finish (autocalibrationStatus -> true).
2. Slowly rotate the sensor (with the optical target attached) within the tracking volume until finished in the intercalibration status becomes true.
3. Copy and paste the output quaternion into the intercalibration section in config.json.
4. Restart the application to apply the intercalibration values.
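The result is applied inside the fusion sink of config.json (see the Config.json Sample section). With placeholder values in place of your own quat output, the updated entry could look like this:

```json
"fusion": {
    // Placeholder quaternion: replace w/x/y/z with the quat values
    // reported by the intercalibration status panel for your setup.
    "intercalibration": { "w": 0.71, "x": 0.0, "y": 0.71, "z": 0.0 }
}
```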
Normal Operation
For normal operation, the autocalibration needs to be completed after starting the application. The intercalibration only needs to be run once for a fixed alignment of the optical target and IMU.

1. Start the application via the batch file or the command line.
2. Wait for the autocalibration to finish.
3. Apply the fused data to your Unreal scene.
Output to Unreal Configuration
VRPN output is configured in the following part of the sinks section of config.json. The device name will be referenced by the plugin for Unreal Engine.

"VRPN": {
    "settings": {
        "deviceName": "Fusion Hub"
    }
}
2. In Unreal Engine, install the VRPN LiveLink plugin:
3. Configure the VRPN source with the correct device and subject name:
4. Apply the output from FusionHub to an Unreal object, e.g. a Cine Camera Actor.
Config.json Sample
{
    // Definition of data sources
    "sources": {
        // Settings for optical source or proxy
        "optical": {
            // Uncomment the lines below to enable Optitrack support
            // Make sure to comment the ART tracking related lines
            //"type": "Optitrack",
            //"settings": {
            //    "host": "localhost",
            //    "connectionType": "Multicast",
            //    "bodyID": 0
            //}
            "type": "DTrack",
            "settings": {
                "port": 5000,
                "bodyID": 1
            }
        }
        // IMU source address, localhost if run on the same node
        // "endpoints": [ "tcp://localhost:8765" ]
        , "imu": {
            "type": "OpenZen",
            "settings": {
                "autodetectType": "ig1"
            }
        }
        // File reader node to replay sensor data
        //, "filereader": {
        //    "filename": "sensorDataWithTimecode.json"
        //}
    },
    // Definition of data sinks
    "sinks": {
        "VRPN": {
            "settings": {
                "deviceName": "LPVP"
            }
        }
        // Sensor fusion node with result quaternion of intercalibration inserted
        , "fusion": {
            // Fill in your license key here
            "LicenseInfo": {
                "LicenseKey": "",
                "ResponseKey": ""
            }
            // Fill in the result from the intercalibration here
            , "intercalibration": { "w": 1.0, "x": 0.0, "y": 0.0, "z": 0.0 }
        }
        // Uncomment the following line to activate intercalibration
        , "intercalibration": {}
        // Uncomment the following line to output data to command line
        //, "echo": {}
        // Uncomment the following block to replay data
        //, "replay": {
        //    "settings": {
        //        "bufferSize": 500e-3,
        //        "stepDuration": 0,
        //        "writerEndpoint": "tcp://*:9921",
        //        "verbose": true,
        //        "timecodeInput": true
        //    }
        //}
        // Uncomment following block to record result data to file in JSON format
        //, "record": {
        //    "filename": "log.a",
        //    "format": "json"
        //}
    }
}
Sending FusionHub Data to External Applications via Native ZeroMQ Interface
FusionHub emits the following data through the local network interface:
- Optical pose as acquired from Antilatency tracking
- The fused pose of the HMD, i.e. the combined data from the HMD IMU and ALT tracking
Output Ports
The ports that this information is output on can be configured in the JSON parameter file config.json of FusionHub. The following lines define these ports:
Antilatency optical poses
...
// Antilatency
"type": "Antilatency",
"settings": {
    // Use this for access from an external process eg. ALVR
    "endpoint": "tcp://*:8899",
...
Fused HMD Poses
...
// Sensor fusion node with result quaternion of intercalibration inserted
"fusion": {
    // Use this for access from an external process eg. ALVR
    "endpoint": "tcp://*:8799",
...
Data Format
The low-level protocol that this data is output over is ZeroMQ (publisher / subscriber). The data itself is encoded as Protocol Buffers, as described here.
Messages are defined in Protobuf (.proto) format:
Antilatency optical poses
message OpticalData {
    string object_name = 11;
    int64 timecode = 1;
    bool fake_timecode = 10;
    int64 recorded_time = 2;
    double latency = 9;
    Vector position = 3;
    Quaternion orientation = 4;
    // Errors should be encoded here as well, but they actually are specific to the source
    // something for errors = 5;
    double quality = 6;
    double frame_rate = 7; // in seconds, necessary?
    int32 frame_number = 8; // internal to optical system
    Vector angular_velocity = 12;
}
Note that object_name corresponds to the custom attribute trackerName that we set for each Antilatency tracker. You can filter the objects that you would like to follow using this identifier.
Fused HMD Poses
message FusedPose {
    string object_name = 6;
    int64 timecode = 5; // Optional: if 0, not set.
    int64 timestamp = 1;
    Vector position = 2;
    Quaternion orientation = 3;
    Vector angular_velocity = 4;
}
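Compiling these definitions with protoc --python_out generates message classes whose ParseFromString method decodes each received frame. As a self-contained sketch of that round-trip, using a stock protobuf message (Timestamp) as a stand-in for the generated OpticalData / FusedPose classes:

```python
from google.protobuf.timestamp_pb2 import Timestamp

# Stand-in message: the classes generated from the definitions above
# are used in exactly the same way.
original = Timestamp(seconds=1700000000, nanos=42)
wire = original.SerializeToString()   # bytes, as published over ZeroMQ

decoded = Timestamp()
decoded.ParseFromString(wire)         # what a subscriber does per frame
print(decoded.seconds)                # -> 1700000000
```

The same pattern applies to every FusionHub message type: receive raw bytes from the socket, then parse them into the matching generated class.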
Python Resources
Download a Python example from this repository.
Prerequisites can be installed in your Python 3 environment with:

pip install zmq
pip install protobuf
Make sure to set the input port in FusionHubPythonExample.py correctly. For example, for the Antilatency source definition below, the port needs to be set to 8899.

"optical": {
    "type": "Antilatency",
    "settings": {
        // Use this for access from an external process eg. ALVR
        "endpoint": "tcp://*:8899",
        // Use this for internal access eg. sensor fusion
        //"endpoint": "inproc://optical_data_source_1",
        "environmentLink": "AntilatencyAltEnvironmentHorizontalGrid~AgAEBLhTiT_cRqA-r45jvZqZmT4AAAAAAAAAAACamRk_AQQCAwICAgICAQICAAI",
        "placementLink": "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA"
    }
},
C# Resources
On parsing Protobuf files: https://github.com/5argon/protobuf-unity
How to subscribe to ZeroMQ messages: https://github.com/gench23/unity-zeromq-client and https://tech.uqido.com/2020/09/29/zeromq-in-unity/
Outputting FusionHub Data via VRPN
In config.json, make sure to add the following block to the sinks section:

"VRPN": {
    "settings": {
        "inputEndpoints": [
            "inproc://optical_data_source_1",
            "tcp://localhost:8899"
        ],
        "port": 3883,
        "deviceName": "FusionHub",
        "tracker0": "HMD"
    }
},
Fused poses or optical data (or both) can be routed to the VRPN sink and are output as VRPN trackers. The output endpoint defined in the source settings needs to be included in the input endpoint array of the VRPN sink. In case of an Antilatency source, this could look like the following:
"optical": { "type": "Antilatency", "settings": { // Use this for access from an external process eg. ALVR "endpoint": "tcp://*:8899", // Use this for internal access eg. sensor fusion //"endpoint": "inproc://optical_data_source_1", "environmentLink": "AntilatencyAltEnvironmentHorizontalGrid~AgAEBLhTiT_cRqA-r45jvZqZmT4AAAAAAAAAAACamRk_AQQCAwICAgICAQICAAI", "placementLink": "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA" } },
tracker0 in the VRPN configuration refers to the trackerName property in the ALT tracker settings; see the Controller Support section. Up to 10 trackers can be defined: tracker0, tracker1, tracker2, etc. The tag in tracker0 (eg. "HMD") will be mapped to VRPN tracker ID 0, tracker1 to ID 1, and so on.
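Putting this together, a VRPN sink exposing two trackers might look like the following. The second tracker name is a placeholder; each tag must match the trackerName of one of your ALT trackers:

```json
"VRPN": {
    "settings": {
        "inputEndpoints": [ "tcp://localhost:8899" ],
        "port": 3883,
        "deviceName": "FusionHub",
        "tracker0": "HMD",      // mapped to VRPN tracker ID 0
        "tracker1": "Pointer"   // placeholder name, mapped to VRPN tracker ID 1
    }
},
```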