...

  • Camera Calibration: Ensure you have proper camera calibration data. Information on camera calibration can be found at /wiki/spaces/LD/pages/2157314049.

  • Configuration File: A JSON file containing camera calibration parameters and other Visual-SLAM settings. Refer to this page for a complete list of available parameters.

...

LPSLAM Controller app

...

Locating and running the app

Navigate to the bin directory within your LPSLAM build folder. The typical path would be:

Code Block
lpgf-slam/out/build/x64-Release/bin/

lpslam_binaries folder:

...

Within this directory, you should find the executable file named lpslam-standalone.exe.

View available options

To see a list of all available command-line arguments and their descriptions, run the following command from your CLI:

Code Block
./lpslam-standalone.exe --help

This will display a detailed explanation of each option, including:

Code Block
Allowed options:
  --help                Show usage information
  --config arg          Load a configuration file
  --use-preset arg      Load a known configuration preset
  --list-presets        List all known configuration presets
  --show-preset arg     Show a known configuration preset
  --logfile arg         Log output to a file
  --replay arg          Replay data from a file
  --record              Record all data during operation
  --record-no-video     Record only sensor and tracking data (without video)
  --show-live           Display the camera feed during operation
  --store-images        Save captured images from the camera
  --verbose             Enable basic verbosity level
  --verbose-debug       Enable detailed verbosity level

Run the app

Assuming your configuration file is named config.json and located in the same directory as the executable, you can run LPSLAM with the following command:

Code Block
./lpslam-standalone.exe --config config.json
Note

Remember to replace config.json with the actual name of your configuration file (including its path, if necessary), and make sure the executable is referenced correctly from your working directory.
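The options listed above can also be combined in a single invocation. For example, the following command loads a configuration, writes the log output to a file named lpslam.log (a placeholder name), records sensor and tracking data without video, and enables basic verbosity:

Code Block
./lpslam-standalone.exe --config config.json --logfile lpslam.log --record-no-video --verbose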

Stop the app

To stop LPSLAM, focus on your CLI and press the ENTER key. This will automatically stop all running processes, allowing the generated 3D map to be saved correctly (if use_map_database is set to true in the configuration file).

Running the GUI controller app

Locate the app

Navigate to the bin directory within your LPSLAM build folder. The typical path would be:

Code Block
lpgf-slam/out/build/x64-Release/bin/

Within this directory, you should find the executable file named lpslam-gui.exe.

Run the app

You can run the controller app by executing it from your CLI or simply double-clicking on it. The following GUI will appear:

...

Usage

Unlike the standalone app, the LPSLAM Controller app has fewer execution options. It is a streamlined version intended for real-time use, simply controlling the execution of LPSLAM.

1) Setting the verbosity

Using the verbosity buttons, you can set the desired verbosity level for the log messages LPSLAM shows in the terminal during execution, indicated by the corresponding icons:

  • Error icon → only ERROR log messages are shown.

  • Info icon → ERROR + INFORMATION log messages are shown.

  • Debug icon → ERROR + INFORMATION + DEBUG messages are shown.

2) Loading a configuration

In the configuration section, you can load the desired file to configure LPSLAM. Simply click the LOAD button, and a file explorer window will pop up for you to select your file.

...

3) Start or Stop tracking

Starting and stopping the tracking process in this app is straightforward; simply click the corresponding button:

  • Start icon → START the tracking process.

  • Stop icon → STOP the tracking process.

Using a Stereolabs ZED2 stereo camera

The configuration shown below is an example that uses OpenCV as the back-end for the input device and the ZED2 stereo camera:

Code Block
languagejson
{
    "_comment": "Configuration for Stereolabs ZED 2 SN25661426 with 672x376 resolution",
    "manager": {
        "show_live": false,
        "thread_num": 4,
        "drop_frames": true
    },
    "cameras": [    
        {
            "number": 0,
            "model": "perspective",
            "resolution_x": 672,
            "resolution_y": 376,
            "fps": 15,
            "fx": 263.1325,
            "fy": 262.98,
            "cx": 333.895,
            "cy": 190.12275,
            "distortion": [
                -0.0398096,
                0.00724821,
                -0.000893297,
                -4.54703e-05,
                -0.00391994],
            "focal_x_baseline": 31.5530,
            "translation": [119.913, 0.00959959, -0.643623],
            "rotation_vec": [-0.00195593, -0.00113609, -0.000112217],
            "mask": "none"
        },
        {
            "number": 1,
            "model": "perspective",
            "resolution_x": 672,
            "resolution_y": 376,
            "fps": 15,
            "fx": 264.2975,
            "fy": 264.1525,
            "cx": 337.325,
            "cy": 193.21975,
            "distortion": [
                -0.0414599,
                0.010013,
                -0.000652239,
                -6.52452e-05,
                -0.00489542],
            "mask": "none"
        }
    ],
    "datasources": [
        {
            "type" : "OpenCV",
            "configuration": {
                "camera_number": 0,
                "grayscale": true, 
                "stereo_mode": "horizontal",
                "width": 672,
                "height": 376,
                "fps": 15
            }
        }
    ],
    "processors": [
        {
            "type": "Undistort",
            "configuration": {}
        }
    ],
    "trackers": [
        {
            "type": "VSLAMStereo",
            "configuration": {
                "camera_setup": "stereo",
                "live_view": true,
                "viewer_FPS": 15,
                "use_CUDA": false,
                "use_OpenCL": false,
                "vocab_file": ".../path_to_file/orb_vocab.dbow2",
                "save_status_file": false,
                
                "emit_map": true,
                "enable_mapping": true,
                "loop_closure": true,
                "map_file_name": "map.db",
                "use_map_database": false,
                
                "min_triangulated_pts": 30,
                "depth_threshold": 20.0,
                "max_keypoints": 1200,
                "scale_factor": 1.2,
                "num_levels": 8,
                "ini_keypoints_threshold": 20,
                "min_keypoints_threshold": 7,

                "forward_high_res_nav": false,
                "forward_IMU": false,
                "forward_nav_state": false,
                "reloc_with_navigation": false,
                "wait_for_navigation": false
            }
        }
    ]
}

The correct path to the file orb_vocab.dbow2 must be provided. This file contains a database used to assign identifiers to image features and is referenced by the vocab_file parameter in the VSLAM tracker configuration. Alternatively, you can copy it from lpgf-slam/data/orb/orb_vocab.dbow2 to your working directory.
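For example, assuming your working directory is the bin folder shown earlier, copying the vocabulary file next to the executable might look like this (the paths are illustrative and depend on your local build layout):

Code Block
cp lpgf-slam/data/orb/orb_vocab.dbow2 lpgf-slam/out/build/x64-Release/bin/

The vocab_file parameter can then reference the copied file with a relative name such as "orb_vocab.dbow2", assuming LPSLAM is launched from that directory.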


To run LPSLAM with a ZED2 stereo camera, connect the camera to your computer and launch either the standalone app or the GUI controller with the appropriate configuration file. The LPSLAM Map Viewer window will then display the current image frame, the reconstructed 3D map, and the camera's estimated pose in space.
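For example, if the configuration above is saved as zed2_config.json (a placeholder name) in the same directory as the executable, the standalone app can be started with:

Code Block
./lpslam-standalone.exe --config zed2_config.json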

Note

If you run LPSLAM on a laptop, which typically has built-in cameras, you will need to disable them to ensure the ZED2 is the sole image source.

To deactivate your built-in cameras on Windows, open Device Manager, expand the Cameras section, right-click on your built-in camera, and select Disable device.

Using a Varjo XR-3 HMD

...


To run LPSLAM with a Varjo XR-3 HMD, modify the configuration file (adjusting cameras and datasources) as shown in the example below. The configs folder also contains the pre-defined configuration lpslam_Varjo_XR3.json for the Varjo XR-3 HMD.

Note

The correct path to the file orb_vocab.dbow2 must be provided. This file contains a database used to assign identifiers to image features and is referenced by the vocab_file parameter in the VSLAM tracker configuration. The file is located in the vocabulary folder and can be referenced in the configuration file with the path "../vocabulary/orb_vocab.dbow2".

Info

When using a Varjo XR-3 HMD, the camera calibration process can be skipped by using a configuration file without the cameras field. LPSLAM will automatically retrieve the camera parameters at startup.

Code Block
languagejson
{
    "_comment" :  "CalibrationConfiguration for Varjo XR-3 in LPR office",
    "manager": {
        "drop_frames": true,
        "thread_num": 4,
        "show_live": false
    },
    "cameras": [
        {
            "number": 0,
            "model": "varjo",
            "fx": 1.18706476688385,
            "fy": 1.18706476688385,
            "cx": 0.532978355884552,
            "cy": 0.5058077573776245,
            "f_correction": 1.7,
            "c_correction": 2.0,
            "resolution_x": 1152,
            "resolution_y": 1152,
            "distortion": [
                0.6412177681922913,
                -0.8660373687744141,
                0.47459715604782104,
                -0.002288734307512641,
                -0.0012009869096800685
            ],
            "focal_x_baseline": 43.20569431080538,
            "rotation": [
                0.9999300244141561,
                0.009785937266698777,
                0.006646932159989518,
                -0.009785385681041777,
                0.9999521152969042,
                -0.0001155009611359007,
                -0.006647744158775169,
                5.045008421054563e-05,
                0.9999779022320736
            ]
        },
        {
            "number": 1,
            "model": "varjo",
            "fx": 1.2481434345245361,
            "fy": 1.2481434345245361,
            "cx": 0.4319934844970703,
            "cy": 0.4994443356990814,
            "resolution_x": 1152,
            "resolution_y": 1152,
            "distortion": [
                0.7144321799278259,
                -0.9300915002822876,
                0.5433128476142883,
                0.0014141352148726583,
                -0.0017072069458663464
            ],
            "rotation": [
                0.9999397368314507,
                -1.0797726242757499e-14,
                0.010978283367585984,
                -5.53855340397135e-07,
                0.9999999987273943,
                5.044704683978703e-05,
                -0.010978283353614959,
                -5.045008712176573e-05,
                0.9999397355589217
            ]
        }
    ],
    "datasources": [
        {
            "type": "Varjo",
            "configuration": {
                "prescale": 3
            }
        }
    ],
    "processors": [
        {
            "type": "Undistort",
            "configuration": {}
        }
    ],
    "trackers": [
        {
            "type": "VSLAMStereo",
            "configuration": {
              "camera_setup": "stereo",
              "use_CUDA": false,
              "use_OpenCL": false,
              "live_view": true,
              "viewer_FPS": 22,
              "vocab_file": ".../path_to_filevocabulary/orb_vocab.dbow2",
              
              "emit_map": true,
              "enable_mapping": true,
              "loop_closure": true,
              "map_file_name": "../map.db",
              "use_map_database": false,
              
              "undistort_image": false,
              
              "min_triangulated_pts": 30,
              "depth_threshold": 20.0,
              "max_keypoints": 1200,
              "scale_factor": 1.2,
              "num_levels": 8,
              "ini_keypoints_threshold": 20,
              "min_keypoints_threshold": 7,

              "forward_high_res_nav": false,
              "forward_IMU": false,
              "forward_nav_state": false,
              "reloc_with_navigation": false,
              "wait_for_navigation": false
            }
        }
    ]
}

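As mentioned in the Info box above, the cameras field can be omitted entirely when using a Varjo XR-3 HMD, since LPSLAM retrieves the camera parameters from the headset at startup. A minimal sketch of such a configuration, with illustrative values taken from the full example above, could look like this:

Code Block
languagejson
{
    "_comment": "Illustrative minimal Varjo XR-3 configuration without the cameras field",
    "manager": {
        "drop_frames": true,
        "thread_num": 4,
        "show_live": false
    },
    "datasources": [
        { "type": "Varjo", "configuration": { "prescale": 3 } }
    ],
    "processors": [
        { "type": "Undistort", "configuration": {} }
    ],
    "trackers": [
        {
            "type": "VSLAMStereo",
            "configuration": {
                "camera_setup": "stereo",
                "live_view": true,
                "vocab_file": "../vocabulary/orb_vocab.dbow2"
            }
        }
    ]
}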
...

With Varjo Base running in the background, launch either the standalone app or the GUI controller with your configuration file to start LPSLAM.
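For example, using the pre-defined configuration from the configs folder (the relative path below is illustrative and depends on your working directory):

Code Block
./lpslam-standalone.exe --config ../configs/lpslam_Varjo_XR3.json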

...