...

Accurate and reliable global positioning of moving vehicles is of great importance across many domains. Global Navigation Satellite Systems (GNSS) like GPS and Galileo provide a way to locate vehicles worldwide. However, there are applications where the GNSS update rate, precision, stability or reliability is not sufficient, or where other positioning systems, such as optical ones, are available and need to be integrated to achieve the best possible quality of the estimated location.

LPNAV is a software system for global vehicle positioning, movement estimation and orientation stabilization that supports multiple application domains. At the core of this system is an algorithm which combines sensor data from different sources into a consistent and accurate estimate of a vehicle’s position, velocity and trajectory. This process is called sensor fusion. Accurate modelling of the vehicle’s dynamics and steering system allows the vehicle’s movement parameters to be predicted even when sensor data is unavailable or unreliable.

Figure 1: Overview

...

The figure shows an overview of the LPNAV system with its various sensor data sources and the output data which is computed. An additional post-processing step improves the visual stability of these output values for VR and AR applications.

Figure 1 shows an overview of the LPNAV input, processing and output. Multiple heterogeneous sensors provide measurement data related to the vehicle’s state. The LP-Research LPMS-IG1 sensor combines a high-quality, high-rate Inertial Measurement Unit (IMU) and a GNSS receiver in one compact package. It serves as the backbone of the LPNAV system, as it provides a global position using the GPS, Galileo and GLONASS satellite constellations. Furthermore, information on the velocity and driving direction can be extracted from the satellite navigation systems by comparing multiple measurements.

...

These core measurements can be enriched by additional data sources available in a specific application scenario. If the vehicle’s CAN bus can be accessed, the current wheel velocity and the steering angle of the wheels can be read out. This information can improve the movement prediction if no updated global position measurement is available. Additional data sources like optical marker-based tracking systems and Lidar can also be used to improve the estimation of the vehicle system state.
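This predict/update principle, keeping a movement prediction alive while correcting it whenever a measurement arrives, can be illustrated with a minimal one-dimensional constant-velocity Kalman filter. This is a sketch only, not LPNAV’s actual algorithm or state model:

```python
import numpy as np

def predict(x, P, F, Q):
    """Propagate the state and its covariance with the motion model."""
    return F @ x, F @ P @ F.T + Q

def update(x, P, H, R, z):
    """Correct the prediction with a sensor measurement z."""
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

dt = 0.01                              # 100 Hz prediction steps
F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity model
Q = np.diag([1e-4, 1e-3])              # process noise
H = np.array([[1.0, 0.0]])             # only position is measured
R = np.array([[0.25]])                 # 0.5 m one-sigma position noise

x = np.array([0.0, 1.0])               # state: [position, velocity]
P = np.eye(2)

for step in range(100):                # one second of driving at ~1 m/s
    x, P = predict(x, P, F, Q)
    if step % 100 == 99:               # a position fix arrives at ~1 Hz
        x, P = update(x, P, H, R, np.array([1.0]))

print(x[0])                            # predicted position after one second
```

Between measurements the filter keeps predicting from the motion model alone, which is exactly what makes dead reckoning over short signal outages possible.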

Application Areas

LPNAV can be used in the following application areas:

In-car virtual reality or augmented reality

...

Indoor / outdoor positioning and dead-reckoning during signal unavailability

Due to poor satellite reception indoors, other methods like optical or radio beacon systems are used to locate vehicles. In many scenarios, not all areas of a building can be properly covered by these indoor tracking systems, for example between storage racks in warehouses. Here, LPNAV can combine the indoor tracking information with a vehicle’s CAN bus data to provide continuous, high-frequency tracking for vehicles throughout the facility.

In-car virtual reality or augmented reality

Vehicle-based virtual and augmented reality applications place high requirements on the visual stability and latency of the position and orientation estimation (in short, tracking) of all components involved. On the one hand, this concerns the user’s VR or AR headset and additional tracked objects inside the car. On the other hand, if the VR and AR content should also integrate with external objects like traffic signs or street markings, the car needs to be localized in global space, and a stable and reliable orientation solution needs to be computed for the car which can accommodate the car’s movement and compensate for tilting of the suspension. Furthermore, these position and orientation estimates need to be provided with a high update rate of 100 Hz or more to give a good visual impression in AR and VR headsets. Here, LPNAV can combine low-frequency (~1 Hz) GNSS and CAN bus measurements with high-rate (> 100 Hz) IMU measurements to provide a high-quality global position and orientation solution at rates of 100 Hz.
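The interplay between low-rate global fixes and high-rate inertial/odometry data can be sketched as simple 2D dead reckoning between fixes. The function names and the simple motion model below are illustrative assumptions, not the LPNAV API:

```python
import math

def dead_reckon(pos, heading_rad, speed_mps, yaw_rate_rps, dt):
    """Advance a 2D pose by one high-rate IMU/odometry step."""
    heading_rad += yaw_rate_rps * dt
    x, y = pos
    x += speed_mps * dt * math.cos(heading_rad)
    y += speed_mps * dt * math.sin(heading_rad)
    return (x, y), heading_rad

pos, heading = (0.0, 0.0), 0.0
for _ in range(100):                   # one second of 100 Hz steps, straight ahead
    pos, heading = dead_reckon(pos, heading, 10.0, 0.0, 0.01)
# a GNSS fix arriving now (at ~1 Hz) would correct any accumulated drift
print(pos)
```

One hundred small steps provide the smooth 100 Hz output between the ~1 Hz global corrections described above.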

Data Sources

The base measurements for LPNAV are provided by the LP-Research LPMS-IG1 combined IMU and GNSS unit. Additional data sources can be used to improve the result of the sensor fusion:

...

For example, the vehicle’s wheel steering direction can be read from the CAN bus and is used by LPNAV to improve the position prediction when known.

...

Units and Uncertainty

If not otherwise stated, LPNAV uses the SI base units: meter, second and so on. All angles are given in radians. All uncertainties are given as the one-sigma width under the assumption that the errors follow a Gaussian distribution.

Coordinate Systems

LPNAV can use both global positions and local positions as input and output of the sensor fusion. Internally, all computations are performed using local positions. Depending on the case in which LPNAV is employed, it may be easiest to use only global positions as input and output, only local positions, or a combination of both.

...

  • Global Coordinate System

This type of representation can express locations on the whole globe within one coordinate system. LPNAV uses the World Geodetic System 84 (WGS 84), which approximates the earth as an ellipsoid and uniquely expresses a three-dimensional location on this ellipsoid as latitude, longitude and height above the ellipsoid. The same coordinate system is used by the GPS system and popular online maps like Google Maps and Openstreetmap.org. Therefore, the global positions from LPNAV can be directly used with these services and vice versa.
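For small areas, WGS 84 latitude and longitude can be approximated by local metric offsets from a reference point. The rough tangent-plane sketch below is an illustration only, not LPNAV’s internal transform, and uses the WGS 84 semi-major axis as the earth radius:

```python
import math

R_EARTH = 6378137.0  # WGS 84 semi-major axis in meters

def wgs84_to_local(lat_deg, lon_deg, ref_lat_deg, ref_lon_deg):
    """Approximate east/north offsets in meters from a reference point.
    Only valid for small areas; larger ones need a full ECEF/ENU transform."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    ref_lat, ref_lon = math.radians(ref_lat_deg), math.radians(ref_lon_deg)
    east = (lon - ref_lon) * math.cos(ref_lat) * R_EARTH
    north = (lat - ref_lat) * R_EARTH
    return east, north

# one degree of latitude is roughly 111 km anywhere on the ellipsoid
east, north = wgs84_to_local(35.66, 139.72, 34.66, 139.72)
print(east, north)
```

This also shows why working in local coordinates internally is convenient: distances and velocities come out directly in meters.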

...

For all positions which are fixed to the vehicle frame, the in-vehicle position coordinate system is used. Its origin is in the center of the front axle, right between the two front wheels. The coordinate system is right-handed; the negative x axis points in the direction of travel and the positive y axis to the right side of the vehicle.

Orientation

The orientation of the vehicle is represented by a quaternion. Rotations around the vehicle’s yaw axis are represented by rotations around the Y axis: with zero rotation, the vehicle points straight towards the north pole; with a rotation of pi/2, it points towards the east. Rotations around the vehicle’s roll axis are represented by rotations around the X axis, where a rotation of pi/2 means the vehicle is tilted 90° to the right. Rotations around the vehicle’s pitch axis are represented by rotations around the Z axis.
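Under this convention, a pure heading rotation is a quaternion with only w and y components. A small sketch, assuming the common (w, x, y, z) component order:

```python
import math

def heading_to_quaternion(yaw_rad):
    """Quaternion (w, x, y, z) for a pure yaw rotation about the Y axis,
    following the convention above: 0 = north, +pi/2 = east."""
    return (math.cos(yaw_rad / 2), 0.0, math.sin(yaw_rad / 2), 0.0)

q_north = heading_to_quaternion(0.0)          # identity: pointing north
q_east = heading_to_quaternion(math.pi / 2)   # pointing east
print(q_north, q_east)
```

Roll and pitch would place their half-angle sine in the x and z components respectively.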

...

For all these purposes, one vehicle model is used. This steering model is based on the Ackermann steering geometry and assumes that only the front wheels are steerable. The main assumption of the Ackermann model is that while steering around a corner, the axes of all wheel hubs meet in one point. This point Pr is the rotation center of the car, and the distances to the individual wheels are the curve radii.

...

In the Ackermann model, each wheel rotates with a different curve radius R around the rotation point Pr, and the steerable wheels each have their own steering angle ϕfl and ϕfr. Two important vehicle parameters influencing the driving dynamics are the track width Lt and the wheel base Lwb.

...

To compute the curve radius of this virtual front center wheel, the following formula can be used:

...

Because the angular velocity when driving a curve is the same at all four wheels and at the virtual front center wheel, the wheel velocities measured at the four wheels individually can be projected to one position (for example the virtual front center wheel) to perform the sensor fusion in one point.
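These standard Ackermann relations can be sketched as follows. The vehicle dimensions are assumed example values, and since the document’s own formula for the virtual front center wheel is not reproduced here, one common form (R = Lwb / sin(ϕ)) is used instead:

```python
import math

L_WB = 2.7   # wheel base Lwb in meters (assumed example value)
L_T = 1.6    # track width Lt in meters (assumed example value)

def front_center_radius(steer_rad):
    """Curve radius of the virtual front center wheel for its steering angle."""
    return L_WB / math.sin(steer_rad)

def wheel_angles(steer_rad):
    """Steering angles of the inner and outer front wheel such that both
    wheel hub axes meet in the common rotation center Pr."""
    r_rear = L_WB / math.tan(steer_rad)       # radius at the rear axle center
    inner = math.atan2(L_WB, r_rear - L_T / 2)
    outer = math.atan2(L_WB, r_rear + L_T / 2)
    return inner, outer

def project_speed(v_wheel, r_wheel, r_target):
    """The angular velocity is shared by all wheels, so v = omega * r
    projects a wheel speed measured at one radius onto another."""
    return v_wheel / r_wheel * r_target

inner, outer = wheel_angles(math.radians(20))
print(inner > math.radians(20) > outer)   # the inner wheel steers more sharply
```

The projection function is the piece that lets four individual wheel speed measurements be fused at a single point, as described above.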

...

  • Vehicle Orientation (on by default)

 

Name of the data source in the configuration file and in the FusionManager.addDataSource() call: I1g

Configuration Parameters

  • sensor_name (String, no default): The name of the IG1 to connect to. The sensor name can be configured by commissioning tools, and the software only connects to a sensor with the exact same name.

  • retry_count (Integer, default 10): The number of times an attempt is made to connect to the sensor.

  • connect_remote (String, no default): If this value holds a URL (for example “tcp://localhost:9982”), the data of a remote IG1 will be streamed over the network to this sensor fusion instance.

  • use_gps (Boolean, default true): If true, the IG1 GPS will be used in the sensor fusion.

  • use_gps_orientation (Boolean, default true): If true, the car heading information provided by the IG1 GPS will be used in the sensor fusion.

  • use_imu (Boolean, default true): If true, the acceleration and turn velocities measured by the IG1 will be used in the sensor fusion.

  • use_pressure (Boolean, default false): If true, the measured ambient pressure will be used to estimate the height of the vehicle.

  • pressure_sigma (Float, default 100): The error estimate used for the pressure measurement, in Pascal.

  • imu_position (In-Vehicle Position, default (0,0,0)): Location of the IG1 in the vehicle coordinate frame.

  • imu_orientation (In-Vehicle Orientation Quaternion, default (1,0,0,0)): Orientation of the IG1 in the vehicle coordinate frame. In the default orientation, the IG1’s and the vehicle’s coordinate axes are the same and the IG1’s connector cables point towards the back of the car.

  • reference_to_optical_quat (Orientation Quaternion, no default): Rotation transform between the IMU frame and the optical tracking frame of an optical tracking marker installed on the IG1 case. This value is determined as part of the LPVR calibration procedure (same name in LPVR) and can be directly plugged in here. If this value is set, reference_orientation_quat must also be set and imu_orientation must not be set.

  • reference_orientation_quat (Orientation Quaternion, no default): Rotation of an optical tracking marker installed on the IG1 case. This value is determined as part of the LPVR calibration procedure (same name in LPVR) and can be directly plugged in here. If this value is set, reference_to_optical_quat must also be set and imu_orientation must not be set.

  • gnss_antenna_position (In-Vehicle Position, default (0,0,0)): The position of the GNSS antenna connected to the IG1.

 

LpSlam Data Source

This data source reads and processes the video stream from a mono or stereo camera system to create a map of the environment and determine the position and orientation of the vehicle in this environment.

...

  • camera_angle (Float, default 0): Up or down tilt angle of the camera in degrees. The sign is negative if the camera is tilted up and positive if it is looking down. If the camera is looking straight along the vehicle’s forward direction, this angle is 0.

  • camera_orientation (Orientation Quaternion, default (1,0,0,0)): Can be used instead of the camera angle if the camera is rotated in more axes than just the tilt angle.

  • camera_position (In-Vehicle Position, default (0,0,0)): Position of the camera in relation to the vehicle’s fixed coordinate origin.

  • connect_remote (String, no default): If this value holds a URL (for example “tcp://localhost:9982”), the data of a remote IG1 will be streamed over the network to this sensor fusion instance.

  • use_position (Boolean, default true): If true, the local position provided by the SLAM algorithm will be used by the sensor fusion.

  • use_orientation (Boolean, default true): If true, the orientation measurement will be used by the sensor fusion.

  • config_file (String, empty by default): Path to the config file which contains the camera calibration and SLAM parameters.

  • forward_fusion_state (Boolean, default false): If true, the current sensor fusion result will also be made available to and used by the SLAM algorithm.

  • write_logfile (Boolean, default false): If true, a logfile named “Lpslam.log” will be written with information about the SLAM processing.

  • log_verbose (Boolean, default false): If true, the logfile will contain more verbose logging output.

...

Installation and Calibration of the System

...

Returns a data structure which contains information about the performance (frames per second and frame processing time) and whether localization using the mapped data is active or not.

Version History

  • 1.0 (2019/2/10): Initial version

  • 1.1 (2019/7/19): Added data source documentation

  • 1.2 (2020/09/20): Added documentation for camera SLAM

...

See the folder “licenses” in the release archive for details on the conditions of the open source licenses used.

 

About Us

This manual was created by LP-RESEARCH Inc. If you have any feedback or questions, please contact us.

Email: info@lp-research.com

Tel: +81-3-6804-1610

Address: #303 Y-Flat, 1-11-15, Nishiazabu, Minato-ku, Tokyo, 106-0031 Japan