LP-MOCAP is a software package that provides IMU-based upper-body motion capture. IMU is short for Inertial Measurement Unit, a device that provides high-frequency updates on its motion. LP-MOCAP combines data from several IMUs attached to a test subject’s upper body with a skeleton model to derive high-frequency information on the subject’s motion. Unlike camera-based motion capture systems, no specially equipped measurement site or external hardware is necessary: several IMUs are attached to the subject’s body, their relative orientations are determined by a calibration procedure, and measurement can begin.

LP-MOCAP allows tracking of one arm as well as the head and upper body. Skeleton data is updated at a frequency of 50 Hz. This manual describes the various steps of using the software in detail.


After installation, LP-MOCAP can be found in the OpenMAT folder of the Windows Start Menu. Figure 1 shows the application window after starting the application. The window is divided into two main parts: on the left, various configuration and control items are shown; on the right, the motion capture result is visualized. Additionally, the status of the application is indicated at the bottom of the window.

Figure 1 - LP-MOCAP control application user interface

In order to capture motion data, the type of motion capture and the sensors to use as well as their position on the subject’s body must be specified. LP-MOCAP can track one arm plus the upper body. The upper body can be configured to consist either of sternum and head, or of sternum and hip.

Figure 2 - The human upper body model of LP-MOCAP allows flexible configuration of sensor placement

Figure 3 - LPMS-B2 sensors can be freely assigned to different parts of the human model

These settings are configured in the top-left section of the application window, which is reproduced in Figure 2 and Figure 3. Besides choosing the type of model, the individual IMUs can be selected here, and various settings, discussed below, can be configured. Before using the application, the IMUs need to be calibrated so as to correctly determine the relation between their orientations and the corresponding body parts. Once this is done, data can be recorded. The controls for this are found in the lower part of the left-hand side of the window and are reproduced in Figure 4. These functions are discussed in further detail below.

Figure 4 - The Operation dialog allows the user to start / stop motion capture, start the calibration procedure and start / stop data recording

Motion capture setup

Attaching the IMUs

The IMU sensors must be attached to the body parts involved in motion capture. They should be positioned close to the major bones of the relevant body parts using the provided straps, as in Figure 5.

  • Head: somewhere on the skull

  • Hip: close to the pubic or pelvic bones, as well as near the sacrum

  • Sternum: on the front of the chest

  • Upper Arm: somewhere near the humerus, a convenient point is near the lower end of the deltoid muscle

  • Lower Arm: the ulnar bone is close to the surface near the wrist

  • Hand: the back of the hand

Figure 5 - LPMS-B2 is inserted into a holder strap that is attached to the user's body

Figure 6 - Typical arrangement of LPMS-B2 sensors on user's body (hip sensor not shown)


Figure 7 - Arrangement of LPMS-B2 sensors including the hip sensor

Software setup

Before recording motion capture data, the motion capture model and the corresponding sensors need to be configured. Figure 3 shows the controls for these settings. Each sensor is identified by a Bluetooth address, a string of six pairs of hexadecimal digits separated by colons. Typically, for LPMS IMUs, these have the form 00:04:3e:aa:bb:cc. The address of each sensor can be determined using our LpmsControl software. It is convenient to write the Bluetooth address and the intended use of each sensor on a sticker attached to the sensor.

In Figure 3, an example setting for the hand is shown. After clicking the checkbox to activate this sensor, the Bluetooth address can be entered. Malformed Bluetooth addresses are shown in red, valid addresses in black. The entered addresses are remembered between sessions.
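The address-format check can be reproduced outside the application, e.g. for scripting a sensor inventory. The sketch below is illustrative only (LP-MOCAP's own validation logic is not published); it merely encodes the documented format of six colon-separated pairs of hexadecimal digits:

```python
import re

# Six pairs of hexadecimal digits separated by colons, e.g. 00:04:3e:aa:bb:cc
BT_ADDRESS = re.compile(r"^[0-9A-Fa-f]{2}(:[0-9A-Fa-f]{2}){5}$")

def is_valid_bt_address(address: str) -> bool:
    """Return True if the string has the Bluetooth-address form the dialog expects."""
    return BT_ADDRESS.match(address) is not None

print(is_valid_bt_address("00:04:3e:aa:bb:cc"))  # True
print(is_valid_bt_address("00:04:3e:aa:bb"))     # False: only five pairs
```

A string failing this check corresponds to an address the dialog would display in red.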

After entering the addresses of the various sensors, the user can click the Connect button near the top of the “Sensors” box. If the entered data is correct, LP-MOCAP should connect to the individual sensors, and visualization should commence in the right half of the window. Before calibration, the visualization will at best bear a superficial resemblance to the subject’s actual pose.

Additional settings

The sensors can be operated in a variety of modes. First, the accelerometer and gyroscope can be operated in various measurement ranges. Outside of special cases, the default settings, which limit the sensors to recording accelerations up to 4 g and angular velocities up to 2000 °/s, are appropriate for human motion.

Figure 8 - The configuration detail dialog allows setting specific filter parameters for each sensor

Accelerometer Correction Generally, gyroscope-based orientation estimation accumulates errors over time: the orientation estimate drifts, and the estimated pose diverges from the true pose. There are several ways to mitigate this by combining the gyroscope with data from other sensors. The IMUs contain a 3-axis gyroscope, a 3-axis accelerometer and a 3-axis magnetometer. The equivalence principle from physics implies that the accelerometer also measures the downward pull of gravitation. Since this force is locally constant, the accelerometer measurements average to the gravitational vector over time, and this fixes one direction in space. The gyroscope and accelerometer measurements can thus be combined, and long-term drift can be restricted to rotations about the gravity axis, the so-called yaw angle.
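The principle of blending fast gyroscope integration with the slow gravity reference can be illustrated by a minimal complementary filter for a single tilt angle. This is a drastic simplification of the actual on-sensor fusion (which operates on full 3D orientations); the function name and the blending constant alpha are illustrative, not part of LP-MOCAP:

```python
import math

def complementary_tilt(gyro_rates, accel_samples, dt=0.02, alpha=0.98):
    """Fuse gyroscope rate and accelerometer tilt into one pitch angle (degrees).

    gyro_rates: angular velocity about one axis per sample, in deg/s
    accel_samples: (ax, az) pairs; gravity makes atan2(ax, az) an absolute
    but noisy tilt reference that corrects long-term gyro drift.
    """
    pitch = 0.0
    for rate, (ax, az) in zip(gyro_rates, accel_samples):
        accel_pitch = math.degrees(math.atan2(ax, az))
        # Integrate the gyro for short-term accuracy; lean gently on the
        # accelerometer's gravity reference to suppress long-term drift.
        pitch = alpha * (pitch + rate * dt) + (1 - alpha) * accel_pitch
    return pitch
```

For a stationary sensor tilted by 30 degrees (zero gyro rate, constant gravity reading), the estimate converges to 30 degrees regardless of its initial value, which is exactly the drift suppression described above.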

Figure 9 - In the model settings tab of the Configuration dialog the user can set the lengths of the limbs of the human model, as well as change certain parameters of the calibration procedure

Magnetic Field Correction Nature provides terrestrial beings with a second locally constant source of directional information: the earth’s magnetic field, which gives us a direction we may call “North” for the sake of clarity. Unfortunately, it cannot be used as is: locally, ferromagnets as well as electrical currents produce additional magnetic fields. In particular, materials inside the sensor case, the test subject’s wristwatch, or their pen may produce a magnetic field. Some of these field sources move along with the sensor, some are fixed in space. The field B measured by the sensor, if it is oriented according to the rotation matrix R, is composed of two parts, B = R⁻¹B_ext + B_0: the environmental magnetic field B_ext, which, from the sensor’s perspective, rotates in the opposite direction as the sensor, and the corotating part B_0 that rotates along with the sensor. Only R⁻¹B_ext contains information on the orientation of the sensor. In order to suppress gyroscope drift based on the magnetic field, this contribution has to be isolated. If the sensor is rotated in place, the measured magnetic field vectors describe a circle whose radius is given by |B_ext| and whose center is given by B_0. In this way a calibration can be performed. This calibration must be performed for each sensor individually, by pressing the button labelled “Mag Calib” next to its Bluetooth address. The calibration data is stored on the sensor and remains valid as long as the magnetic fields do not change.

The magnetic field calibration consists of two steps: first the user is asked to rotate the sensor in place for a number of seconds, and the data gathered during this step is used to perform the calibration as outlined above. In the second step, the sensor is held steady and it measures the inclination of the outside magnetic field, in order to correctly relate it to the gravitational pull. Using the direction of the external magnetic field and the gravitational acceleration vector then allows, in theory, drift-free operation of the sensor.
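The on-sensor calibration algorithm is not published, but the geometric idea above (measured field vectors lying on a circle of radius |B_ext| around the offset B_0) can be reproduced with a standard least-squares circle fit. All names below are illustrative; the fit is shown in 2D for one rotation axis:

```python
def solve3(M, v):
    """Solve a 3x3 linear system by Gaussian elimination with partial pivoting."""
    a = [row[:] + [rhs] for row, rhs in zip(M, v)]
    for col in range(3):
        pivot = max(range(col, 3), key=lambda r: abs(a[r][col]))
        a[col], a[pivot] = a[pivot], a[col]
        for r in range(col + 1, 3):
            f = a[r][col] / a[col][col]
            for c in range(col, 4):
                a[r][c] -= f * a[col][c]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        x[r] = (a[r][3] - sum(a[r][c] * x[c] for c in range(r + 1, 3))) / a[r][r]
    return x

def fit_field_circle(samples):
    """Least-squares circle fit to 2D magnetometer samples.

    Returns (center, radius): the center plays the role of the corotating
    offset B_0, the radius that of the environmental field strength |B_ext|.
    ||s - c||^2 = r^2 is rewritten linearly in (cx, cy, r^2 - ||c||^2).
    """
    A = [[2 * x, 2 * y, 1.0] for x, y in samples]
    b = [x * x + y * y for x, y in samples]
    # Normal equations A^T A u = A^T b for the 3 unknowns.
    M = [[sum(row[i] * row[j] for row in A) for j in range(3)] for i in range(3)]
    v = [sum(A[k][i] * b[k] for k in range(len(A))) for i in range(3)]
    cx, cy, d = solve3(M, v)
    radius = (d + cx * cx + cy * cy) ** 0.5
    return (cx, cy), radius
```

Applied to samples gathered while rotating a sensor in place, the fitted center is the hard-iron offset to subtract before the field can serve as a North reference.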

Filter Settings In order to make use of the various sources of information, a variety of filtering techniques can be applied. The two most useful for motion capture are “Gyr + Acc (Kalman)”, used when the magnetic field is not calibrated or when nearby ferromagnets, strong switching electrical currents, etc. make its use impossible, and “Gyr + Acc + Mag (DCM)”, which relies on the complete set of sensor data.

Motion capture calibration

After connecting to the sensors, clicking the calibration button starts the calibration procedure. It performs a different set of steps depending on which body parts are enabled for motion capture. Each calibration step consists of a sequence of poses which the subject has to strike and hold for a few seconds. These poses are described on screen during the procedure. We recommend performing the calibration while standing and facing the same direction throughout.

Arm calibration

The arm calibration calibrates the sensors attached to the arm and hand. For this, two poses are taken: the T pose and the A pose, which are illustrated in Figure 10 and Figure 11. Each pose will have to be held a few seconds in order to accumulate data for the calibration.

Figure 10 - Calibration T-pose; the palms of the hands should be facing downwards (thumbs forward).

Figure 11 - Calibration A-pose; same as with the T-pose the palms of the hands should be facing downwards (thumbs forward).

Upper-body calibrations

The upper body calibration is used for the alignment of the hip and sternum sensors. The subject is asked to stand upright for a few seconds (Figure 12) and then to lean forward slightly and hold that position for a few seconds, as shown in Figure 13. The head sensor is calibrated independently: the head has to be held upright and then inclined forward in much the same way.


Figure 12 - Calibration neutral pose


Figure 13 - Calibration bowing pose

Figure 14 - Calibration head nod pose

Motion capture visualization

Once the motion capture is calibrated and running, the visualization pane of the main window displays a real-time model of the motion-capture data. As an example, Figure 15 shows the situation during motion capture of the upper body with five IMU sensors. The global coordinate frame is indicated as well as the local coordinate frames corresponding to the orientations of each limb. The limbs are indicated by cylinders and the joints of the model are indicated by spheres.

Figure 15 - Example display of upper-body motion-capture data without further configuration

The view can be customized in several ways. Each limb can be given a color on the settings tab of the configuration pane on the left part of the application window. Additionally, the visualization can be modified in the following ways:

  • Rotation: left click into the visualization window, then drag while keeping the mouse button pressed

  • Pan: right click into the visualization window, then drag while keeping the mouse button pressed

  • Zoom: rotate the mouse wheel

In Figure 16 (latest software version as in Figure 17) the same motion capture scene as above is depicted with modified settings. Shown here is a move from the dance known as the Egyptian Walk.

Figure 16 - Customized view of the same scene (old software)

Figure 17 - Selection dialog to adjust colors of human model

Motion capture recording

Motion capture data can be recorded by clicking the button labelled “Start Recording.” This will automatically create a file in the log subdirectory of the path where LP-MOCAP is installed, typically C:\OpenMAT\LpMocap-x.x.x\bin\log (x.x.x being the version number). The file will be labelled by the date and the time the recording started, and by whether the right or the left arm was selected for motion capture. A typical filename will be 20190215-151919_left_arm.csv for a file recorded on Feb 15th, 2019, starting at 15:19:19.
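The filename convention described above can be decoded programmatically when organizing recordings. The sketch below assumes only the documented pattern (date-time stamp, arm side, .csv extension); the helper name is illustrative:

```python
from datetime import datetime

def parse_log_filename(name):
    """Split an LP-MOCAP log filename such as '20190215-151919_left_arm.csv'
    into the recording start time and the tracked arm side."""
    stem = name.rsplit(".", 1)[0]          # drop the .csv extension
    timestamp, side, _ = stem.split("_")   # '20190215-151919', 'left', 'arm'
    return datetime.strptime(timestamp, "%Y%m%d-%H%M%S"), side

start, side = parse_log_filename("20190215-151919_left_arm.csv")
# start is datetime(2019, 2, 15, 15, 19, 19); side is 'left'
```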

The data is recorded in a CSV file, where after a line of column headers, each row contains a set of simultaneously recorded data. The set of columns depends on the selected limbs, but it remains the same throughout each recording. For each limb the same set of data is recorded. Please refer to Table 1 for an overview of the fields.

The instantaneous pose of each limb is encoded in the four quaternion components labelled GlobalQuat and the three vector components GlobalPos. The global coordinate frame is a right-handed frame such that +Y points upwards and the user is facing towards the +Z direction. The quaternions encode the deviation of the limb from a null pose which for the arms is assumed to be parallel to -X (on the right) and +X (on the left), for the trunk and head the null orientation is along +Y.
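A recording can be post-processed with any CSV library. The sketch below assumes hypothetical per-limb column prefixes such as UpperArm_ (the actual header names are whatever appears in the first line of your recording, so adapt the prefix accordingly); the sample data and the function name are likewise illustrative:

```python
import csv
import io

# Hypothetical one-row excerpt; a real file carries the fields of Table 1,
# one column group per tracked limb, with the file's own header names.
SAMPLE = """UpperArm_GlobalQuatW,UpperArm_GlobalQuatX,UpperArm_GlobalQuatY,UpperArm_GlobalQuatZ,UpperArm_GlobalPosX,UpperArm_GlobalPosY,UpperArm_GlobalPosZ
1.0,0.0,0.0,0.0,0.0,1.4,0.0
"""

def read_limb_pose(fileobj, prefix):
    """Yield (quaternion, position) tuples for one limb from a recording."""
    for row in csv.DictReader(fileobj):
        quat = tuple(float(row[prefix + "GlobalQuat" + c]) for c in "WXYZ")
        pos = tuple(float(row[prefix + "GlobalPos" + c]) for c in "XYZ")
        yield quat, pos

poses = list(read_limb_pose(io.StringIO(SAMPLE), "UpperArm_"))
```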

For instance, if the upper arm GlobalQuat is the unit quaternion q = 1, the right arm is tracked, and the upper arm length is configured to be 30 cm, then the elbow is situated 30 cm along the -X direction from the shoulder. If the quaternion instead encodes a 90° rotation about the vertical +Y axis, q = (√2/2, 0, √2/2, 0) in (W, X, Y, Z) order, then the upper arm points forward along the +Z direction, and so on.
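This relation can be checked numerically by rotating the null-pose limb vector with the recorded quaternion. The sketch assumes the quaternion components are in (W, X, Y, Z) order and encode a standard active rotation; the function name is illustrative:

```python
import math

def rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z): v' = q v q*."""
    w, rx, ry, rz = q
    # v' = v + 2 r x (r x v + w v), with r = (x, y, z)
    cx = ry * v[2] - rz * v[1] + w * v[0]
    cy = rz * v[0] - rx * v[2] + w * v[1]
    cz = rx * v[1] - ry * v[0] + w * v[2]
    return (
        v[0] + 2 * (ry * cz - rz * cy),
        v[1] + 2 * (rz * cx - rx * cz),
        v[2] + 2 * (rx * cy - ry * cx),
    )

null_pose = (-0.30, 0.0, 0.0)    # right upper arm, 30 cm, along -X
identity = (1.0, 0.0, 0.0, 0.0)  # q = 1: elbow stays 30 cm along -X
s = math.sqrt(0.5)
yaw90 = (s, 0.0, s, 0.0)         # 90 deg about +Y: arm points along +Z

elbow = rotate(identity, null_pose)   # (-0.3, 0.0, 0.0)
forward = rotate(yaw90, null_pose)    # approximately (0.0, 0.0, 0.3)
```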

Table 1 - Recorded data for each body part

The same set of fields is recorded for every tracked body part (upper arm, other limbs, etc.):

  • SensorId, Timestamp

  • AccX, AccY, AccZ

  • GyroX, GyroY, GyroZ

  • GlobalQuatW, GlobalQuatX, GlobalQuatY, GlobalQuatZ

  • GlobalPosX, GlobalPosY, GlobalPosZ


In the following table, the components of the global displacement vectors corresponding to the various limbs in their null orientation are given:

Table 2 - Null displacement for the various body parts. The values correspond to the labels in the GUI

  Body Part    X                    Y                    Z
  Hand         ±Hand Length         0                    0    (+ on left, - on right)
  Lower Arm    ±Lower Arm Length    0                    0    (+ on left, - on right)
  Upper Arm    ±Upper Arm Length    0                    0    (+ on left, - on right)
  Head         0                    Neck Length          0
  Hip          0                    -Torso Height / 2    0
  Sternum      0                    Torso Height / 2     0
The CSV file contains additional columns for accelerometer and gyroscope readings, but since these are given in sensor coordinates, and no transformation from sensor to global coordinates is provided, these values are of little use. This is a known bug, first noted in a previous version of this manual; a future version will correct it.

Version History

  • Initial version

  • Updated text and images to latest software version

  • Changed document title; updated missing references

About Us

This manual was created by LP-RESEARCH Inc. If you have any feedback or questions, please contact us.






#303 Y-Flat, 1-11-15, Nishiazabu, Minato-ku, Tokyo, 106-0031 Japan