A ROS2 package for computing the intrinsic and extrinsic calibration between a camera and a LiDAR sensor. This repository provides an intuitive workflow for fusing data from the two sensors, enabling precise projection of LiDAR points into the camera frame and offering an efficient approach to sensor fusion.
| Static Sensors | Moving Sensors |
|---|---|
| ![]() | ![]() |
If you find this project helpful and want to support its ongoing development, you can buy me a coffee! Every contribution helps me dedicate more time to improving and maintaining open-source software.
To run this package, ensure the following dependencies are installed:
- Git: For version control and repository management.
- Docker: To streamline the environment setup and execution.
- NVIDIA Container Toolkit (if using an NVIDIA GPU): For hardware acceleration.
Start by cloning the repository:
```bash
git clone git@github.com:CDonosoK/ros2_camera_lidar_fusion.git
```
This repository includes a pre-configured Docker setup for easy deployment. To build the Docker image:
- Navigate to the `docker` directory:

  ```bash
  cd ros2_camera_lidar_fusion/docker
  ```

- Run the build script:

  ```bash
  sh build.sh
  ```

  This will create a Docker image named `ros2_camera_lidar_fusion`.
Once built, launch the container using:
```bash
sh run.sh
```
This package includes the following ROS2 nodes for camera and LiDAR calibration:
| Node Name | Description | Output |
|---|---|---|
| `get_intrinsic_camera_calibration.py` | Computes the intrinsic calibration of the camera. | Camera intrinsic calibration file. |
| `save_sensor_data.py` | Records synchronized data from the camera and LiDAR sensors. | Sensor data file. |
| `extract_points.py` | Allows manual selection of corresponding points between camera and LiDAR. | Corresponding points file. |
| `get_extrinsic_camera_calibration.py` | Computes the extrinsic calibration between the camera and LiDAR sensors. | Extrinsic calibration file. |
| `lidar_camera_projection.py` | Projects LiDAR points into the camera frame using the intrinsic and extrinsic calibration parameters. | Visualization of projected points. |
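As a rough illustration of the kind of synchronization that `save_sensor_data.py` performs, the sketch below pairs an image topic with a point-cloud topic using `message_filters`. It is not the node's actual implementation, and the topic names `/camera/image_raw` and `/lidar/points` are placeholders that will differ per setup.

```python
# Minimal sketch (not the package's node): approximate time synchronization
# of camera and LiDAR messages. Topic names are illustrative placeholders.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image, PointCloud2
from message_filters import Subscriber, ApproximateTimeSynchronizer


class SyncRecorderSketch(Node):
    def __init__(self):
        super().__init__('sync_recorder_sketch')
        image_sub = Subscriber(self, Image, '/camera/image_raw')
        cloud_sub = Subscriber(self, PointCloud2, '/lidar/points')
        # Pair messages whose timestamps differ by at most 0.05 s.
        self.sync = ApproximateTimeSynchronizer(
            [image_sub, cloud_sub], queue_size=10, slop=0.05)
        self.sync.registerCallback(self.on_pair)

    def on_pair(self, image_msg, cloud_msg):
        # A real recorder would write both messages to disk here.
        self.get_logger().info('Received a synchronized image/point-cloud pair')


def main():
    rclpy.init()
    rclpy.spin(SyncRecorderSketch())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```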
Follow these steps to perform calibration and data fusion:
- **Intrinsic Calibration**: Run `get_intrinsic_camera_calibration.py` to generate the intrinsic calibration file for the camera (see the chessboard calibration sketch after this list).
- **Data Recording**: Use `save_sensor_data.py` to capture and save synchronized data from the camera and LiDAR.
- **Point Correspondence**: Execute `extract_points.py` to manually select corresponding points between the camera and LiDAR.
- **Extrinsic Calibration**: Run `get_extrinsic_camera_calibration.py` to compute the transformation matrix between the camera and LiDAR (see the solvePnP sketch after this list).
- **LiDAR Projection**: Use `lidar_camera_projection.py` to project LiDAR points into the camera frame for visualization and analysis.
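For context, the intrinsic step is typically a standard chessboard calibration: detect the board corners in several images, then solve for the camera matrix and distortion coefficients. The snippet below is a minimal OpenCV sketch of that idea, not the node's actual code; the board dimensions, square size, and image directory are assumptions.

```python
# Minimal OpenCV sketch of chessboard-based intrinsic calibration.
# Board geometry and the image directory are illustrative assumptions.
import glob
import cv2
import numpy as np

board_cols, board_rows = 9, 6        # inner chessboard corners per row/column
square_size = 0.025                  # square edge length in meters

# 3D corner coordinates in the board's own frame (z = 0 plane).
objp = np.zeros((board_rows * board_cols, 3), np.float32)
objp[:, :2] = np.mgrid[0:board_cols, 0:board_rows].T.reshape(-1, 2) * square_size

obj_points, img_points, image_size = [], [], None
for path in glob.glob('calib_images/*.png'):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, (board_cols, board_rows))
    if found:
        obj_points.append(objp)
        img_points.append(corners)
        image_size = gray.shape[::-1]

if not obj_points:
    raise SystemExit('No chessboard detections found in calib_images/')

# Solve for the camera matrix K and the distortion coefficients.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, image_size, None, None)
print('RMS reprojection error:', rms)
print('Camera matrix:\n', K)
print('Distortion coefficients:', dist.ravel())
```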
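Conceptually, the extrinsic step reduces to a Perspective-n-Point problem: given the manually selected 3D LiDAR points, their 2D pixel counterparts, and the intrinsics from the first step, solve for the rotation and translation that map LiDAR coordinates into the camera frame. The sketch below shows that idea with `cv2.solvePnP`; every number in it is a synthetic placeholder rather than real calibration data.

```python
# Minimal sketch: recover the camera<-LiDAR extrinsics from 2D-3D point
# correspondences with solvePnP. All values below are synthetic placeholders.
import cv2
import numpy as np

# Assumed intrinsics (stand-ins for the intrinsic calibration output).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)

# Synthetic 3D points in the LiDAR frame (stand-ins for manually picked points).
lidar_pts = np.array([[2.0, 0.5, 0.1], [2.5, -0.4, 0.2], [3.0, 0.0, -0.3],
                      [2.2, 0.8, -0.1], [2.8, -0.6, 0.4], [3.5, 0.3, 0.0]])

# Synthetic matching pixels, generated from a known ground-truth pose so the
# example is self-checking; in practice these come from extract_points.py.
rvec_gt = np.array([[0.01], [-1.57], [0.02]])
tvec_gt = np.array([[0.1], [-0.05], [0.2]])
pixel_pts, _ = cv2.projectPoints(lidar_pts, rvec_gt, tvec_gt, K, dist)

# Solve for the rotation and translation mapping LiDAR points into the camera frame.
ok, rvec, tvec = cv2.solvePnP(lidar_pts, pixel_pts, K, dist)
R, _ = cv2.Rodrigues(rvec)
T = np.eye(4)
T[:3, :3] = R
T[:3, 3] = tvec.ravel()
print('Estimated camera<-LiDAR transform:\n', T)
```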
To execute a specific node, use the following command:

```bash
ros2 run ros2_camera_lidar_fusion <node_name>
```

For example:

```bash
ros2 run ros2_camera_lidar_fusion lidar_camera_projection.py
```
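Under the hood, projection of this kind amounts to transforming each LiDAR point into the camera frame with the extrinsic matrix and then applying the intrinsic model. The sketch below shows that pipeline in NumPy/OpenCV; `K`, `dist`, `T`, and the point cloud are placeholders for the calibration files and sensor data produced by the steps above.

```python
# Minimal sketch: project LiDAR points into the image using the calibration
# results. K, dist, T, and the point cloud are illustrative placeholders.
import cv2
import numpy as np

K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)
T = np.eye(4)                                        # camera <- LiDAR transform

points_lidar = np.random.uniform(-5, 5, (1000, 3))   # stand-in for a point cloud

# Transform the points into the camera frame using homogeneous coordinates.
points_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
points_cam = (T @ points_h.T).T[:, :3]

# Keep only points in front of the camera, then apply the intrinsic model.
in_front = points_cam[:, 2] > 0.0
rvec = np.zeros(3)
tvec = np.zeros(3)
pixels, _ = cv2.projectPoints(points_cam[in_front], rvec, tvec, K, dist)
pixels = pixels.reshape(-1, 2)

# Draw the projected points on a blank image for visualization.
image = np.zeros((480, 640, 3), dtype=np.uint8)
for u, v in pixels.astype(int):
    if 0 <= u < image.shape[1] and 0 <= v < image.shape[0]:
        cv2.circle(image, (int(u), int(v)), 1, (0, 255, 0), -1)
```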
This package is maintained by:

- **Clemente Donoso**
- Email: [email protected]
- GitHub: [CDonosoK](https://github.com/CDonosoK)
This project is licensed under the MIT License. See the LICENSE file for details.
Contributions and feedback are welcome! If you encounter any issues or have suggestions for improvements, feel free to open an issue or submit a pull request.