
Self-localization and Map Generation for Mobile Robots Using Solid-State LiDAR (RealSense)

2025-09-26

Light Detection And Ranging (LiDAR) plays a crucial role in SLAM (Simultaneous Localization and Mapping) and is considered one of the most important perception devices. While mechanical LiDAR was considered mainstream several years ago, recently MEMS (Micro-Electro-Mechanical Systems) type LiDAR has gained popularity as a cost-effective, lightweight solution, accelerating its adoption in small-scale robots.

This article introduces program source code for SLAM using the RealSense L515, a solid-state LiDAR camera built around Intel's proprietary MEMS mirror scanning technology. The code, published on GitHub, has been demonstrated to provide more accurate localization and higher-quality mapping than Intel's standard programs. We hope this serves as a useful reference for those working with SLAM.

This article references information from the RealSense GitHub repositories SSL_SLAM / SSL_SLAM2:

Lightweight 3-D Localization and Mapping for Solid-State LiDAR (Intel Realsense L515 as an example)

https://github.com/wh200720041/ssl_slam

https://github.com/wh200720041/ssl_slam2 (updated version)

About the SLAM Program Source Code

Compared to mechanical LiDAR, solid-state LiDAR offers higher scanning frequency and angular resolution but has a limited field of view (FoV), which can create uncertainty when using existing LiDAR SLAM algorithms. This new sensing device requires more robust and computationally efficient SLAM methods.

To address this need, this program source code proposes a new SLAM framework for solid-state LiDAR that includes feature point extraction, odometry estimation (movement distance and rotation angles), and map building. The proposed method has been evaluated on warehouse robots and handheld devices.

About the Equipment and Technology Used

What is the Intel RealSense Solid-State LiDAR Camera L515?

The Intel RealSense LiDAR Camera L515 (hereinafter L515) is an ultra-compact, high-resolution, highly versatile depth camera that incorporates Intel's proprietary MEMS (Micro-Electro-Mechanical Systems) mirror scanning technology, one type of solid-state technology. It is ideal for developing indoor applications that require high-resolution, high-precision depth data, and since its release it has been incorporated into a variety of research and development projects that leverage these characteristics.

For example, it has been used in telepresence robots, automated guided vehicles (AGVs) specialized for automated transportation and carrying tasks, and as introduced in our "DIM Weight software for easily measuring box sizes," it shows promise for use in warehouse storage of goods and products, as well as logistics.


In addition to its functionality, the L515's compact size (about the size of a tennis ball) and weight of only 100g (about the weight of a convenience store rice ball) make it less constrained by weight limitations when mounted on mobile robots, and it rarely causes issues where the camera itself becomes an obstacle. Furthermore, its refined design has received high praise for not spoiling the aesthetics of installation locations.

About MEMS Technology

MEMS, the technology at the heart of the L515, stands for Micro-Electro-Mechanical Systems. It refers to devices and systems such as sensors, actuators, and microelectronics that are manufactured using microfabrication technology, integrating tiny electrical and mechanical components on a single substrate to achieve miniaturization and high performance. The microscopic three-dimensional structures of MEMS devices can handle a wide variety of input/output signals while consuming little power, which has drawn significant attention.


About AGV

AGV (automatic guided vehicle) is an unmanned transport cart that automatically travels along designated routes to carry objects. In contrast, AMR (autonomous mobile robot) is a transport robot capable of autonomous navigation that can automatically avoid obstacles such as people and objects. Both are gaining attention for automating transport tasks to address social challenges like labor shortages and workload reduction.

About the SSL_SLAM / SSL_SLAM2 Framework

The code we introduce implements the paper "Lightweight 3-D Localization and Mapping for Solid-State LiDAR" published in IEEE Robotics and Automation Letters, 2021.

If you wish to save maps and test localization separately, please refer to SSL_SLAM2 (an extension of SSL_SLAM), which separates the mapping and localization modules.

Supported languages: C++, CMake

Author: Wang Han, Nanyang Technological University (NTU), Singapore

1. Solid-State LiDAR (Example)

1-1. Scene Reconstruction (Example)

1-2. 3D Building Model Construction Using SfM (Structure from Motion) (Example)

1-3. Localization and Mapping Framework Using L515

2. Preliminary Preparation (Prerequisites)

2-1. Operating System

Ubuntu 64-bit 18.04 with ROS Melodic installation*

*The compatible Ubuntu version for ROS packages varies by version.

※For installation of the open-source Robot Operating System (ROS), see here
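For convenience, the standard Melodic installation from the ROS wiki can be sketched as follows. These commands are shown for reference only; follow the official instructions if your environment differs.

```shell
# Add the ROS apt repository and its signing key (per the ROS wiki)
sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu $(lsb_release -sc) main" > /etc/apt/sources.list.d/ros-latest.list'
sudo apt-key adv --keyserver 'hkp://keyserver.ubuntu.com:80' --recv-key C1CF6E31E6BADE8868B172B4F42ED6FBAB17C654

# Install the full desktop variant of ROS Melodic
sudo apt update
sudo apt install ros-melodic-desktop-full

# Source the ROS environment in every new shell
echo "source /opt/ros/melodic/setup.bash" >> ~/.bashrc
source ~/.bashrc
```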

2-2: Ceres Solver

For installation of the efficient nonlinear optimization library Ceres, see here
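If you prefer to build Ceres from source, the steps from its official installation guide look roughly like this (dependency package names are the Ubuntu 18.04 ones):

```shell
# Build dependencies for Ceres Solver
sudo apt-get install cmake libgoogle-glog-dev libgflags-dev \
    libatlas-base-dev libeigen3-dev libsuitesparse-dev

# Fetch the source and do a standard out-of-source CMake build
git clone https://ceres-solver.googlesource.com/ceres-solver
mkdir ceres-bin && cd ceres-bin
cmake ../ceres-solver
make -j3
sudo make install
```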

2-3. PCL

For installation of the Point Cloud Library (PCL), see here

※Tested with version 1.8.1

2-4. OctoMap

For installation of OctoMap for SLAM map representation for robots and autonomous vehicles, see here

sudo apt-get install ros-melodic-octomap*
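For reference, a map built with octomap_server can be saved to a binary .bt file while a mapping session is running; `octomap_saver` ships with the octomap_server package installed above (the output path below is just an example):

```shell
# Save the current OctoMap to a compact binary .bt file
# (run this while an octomap_server-based mapping session is active)
rosrun octomap_server octomap_saver -f ~/maps/office.bt

# View the saved map offline with octovis
# (octovis is a separate package: ros-melodic-octovis)
octovis ~/maps/office.bt
```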

2-5. Trajectory Information for Visualization

For visualization purposes, this package uses the hector trajectory server. You can install the package with:

sudo apt-get install ros-melodic-hector-trajectory-server

Alternatively, if you don't need trajectory visualization, you can remove the hector trajectory server node.

3. Build

3-1. Clone the Repository

cd ~/catkin_ws/src
git clone https://github.com/wh200720041/ssl_slam.git
cd ..
catkin_make
source ~/catkin_ws/devel/setup.bash

3-2. Download ROSbag for Testing

If you don't have an L515, you can download recorded test data (approx. 5GB).

Extract the file directly under your Downloads folder (default: /home/user/Downloads).

cd ~/Downloads
unzip ~/Downloads/L515_test.zip
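Before launching, it can be worth sanity-checking the download with `rosbag info`; the bag filename below is an assumption, so adjust it to whatever the archive actually contains:

```shell
# Print duration, topics, and message counts of the test bag
# (filename is an assumption -- check the extracted contents)
rosbag info ~/Downloads/L515_test.bag
```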

3-3. Launch ROS

To run localization and build a map at the same time:

roslaunch ssl_slam ssl_slam_mapping.launch

Or, to additionally build a probabilistic occupancy map with OctoMap:

roslaunch ssl_slam ssl_slam_octo_mapping.launch

If you only need localization, you can run:

roslaunch ssl_slam ssl_slam.launch
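While any of these launch files is running, a few standard ROS commands help confirm that data is flowing. The `/odom` topic name below is an assumption based on common ROS conventions; check `rostopic list` for the names this package actually publishes:

```shell
rostopic list                # show every active topic
rostopic hz /odom            # measure the odometry publish rate (name assumed)
rostopic echo -n 1 /odom     # print a single odometry message (name assumed)
```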

4. L515 Setup

If you have an L515, please follow these setup steps.

4-1. Librealsense

For installation of Librealsense, see here

4-2. Realsense_ros

Clone the realsense-ros package into your catkin workspace:

cd ~/catkin_ws/src
git clone https://github.com/IntelRealSense/realsense-ros.git
cd ..
catkin_make
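Before running SLAM, the camera can be tested on its own with the stock realsense-ros launch file; the `filters:=pointcloud` argument enables the registered point cloud output:

```shell
# Start the RealSense driver alone and publish a point cloud
roslaunch realsense2_camera rs_camera.launch filters:=pointcloud
```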

4-3. Launch ROS

roslaunch ssl_slam ssl_slam_L515.launch

This will run ssl_slam_mapping.launch with live data from the L515.
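To confirm that live depth frames are actually reaching ROS, you can measure the rate of the depth stream; the topic name assumes the default realsense2_camera namespace:

```shell
# Should report a steady rate while the L515 is streaming
rostopic hz /camera/depth/image_rect_raw
```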

5. About Citation

If you use this work (SSL_SLAM / SSL_SLAM2) in your research, please cite the following paper. We would appreciate your citation.

5-1. SSL_SLAM Framework

@article{wang2021lightweight,
  author={H. {Wang} and C. {Wang} and L. {Xie}},
  journal={IEEE Robotics and Automation Letters}, 
  title={Lightweight 3-D Localization and Mapping for Solid-State LiDAR}, 
  year={2021},
  volume={6},
  number={2},
  pages={1801-1807},
  doi={10.1109/LRA.2021.3060392}}

5-2. SSL_SLAM2 Framework

SSL_SLAM2 is an extension of the same work, and its repository recommends citing the same paper as above (wang2021lightweight).

Finally

The Intel LiDAR Camera L515 mentioned in this article is available through our company.