LuSeg: Efficient Negative and Positive Obstacles Segmentation via Contrast-Driven Multi-Modal Feature Fusion on the Lunar

Shuaifeng Jiao, Zhiwen Zeng, Zhuoqun Su, Xieyuanli Chen, Zongtan Zhou*, Huimin Lu

🔥News:

  • [2025-03] Code released.
  • [2025-02] Submitted to IROS 2025.

1. 🛰️Introduction

This repository introduces the Lunar Exploration Simulator System (LESS), a lunar surface simulation system, alongside the LunarSeg dataset, which supplies RGB-D data for the segmentation of lunar obstacles, including both positive and negative instances. Furthermore, it presents a novel two-stage segmentation network, termed LuSeg.

Our accompanying video is available at Demo.


Citation

Please cite the corresponding paper:

@article{jiao2024luseg,
  title={LuSeg: Efficient Negative and Positive Obstacles Segmentation via Contrast-Driven Multi-Modal Feature Fusion on the Lunar},
  author={Shuaifeng Jiao and Zhiwen Zeng and Zhuoqun Su and Xieyuanli Chen and Zongtan Zhou and Huimin Lu},
  journal={arXiv preprint arXiv:2503.11409},
  year={2025}
}

2. 🇨🇳Lunar Exploration Simulator System (LESS)

2.1 Overview of the Lunar Exploration Simulator System

The LESS system integrates a high-fidelity lunar terrain model, a customizable rover platform, and a multi-modal sensor suite, and supports the Robot Operating System (ROS) to enable realistic data generation and the validation of autonomous perception algorithms for the rover. LESS provides a scalable platform for developing and validating perception algorithms in extraterrestrial environments. This open-source framework is designed for high extensibility, allowing researchers to integrate additional sensors or customize terrain models to meet the specific requirements of their applications.

[Figure: Overview of the LESS system]

2.2 Application Examples of LESS

You can collect multimodal data in the LESS system according to your needs, for example by recording the rover's sensor topics over ROS (see the sketch below).

[Figure: Examples of multimodal data collected in LESS]
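As a concrete illustration, the sketch below records approximately time-synchronized RGB-D pairs from a ROS1-based simulation. The topic names (/camera/rgb/image_raw, /camera/depth/image_raw) and the output file naming are hypothetical placeholders; substitute the topics actually published by your LESS sensor configuration.

# Minimal sketch (ROS1 / rospy): save approximately time-synchronized RGB-D pairs.
# The topic names below are hypothetical placeholders for the LESS camera topics.
import numpy as np
import rospy
import message_filters
import cv2
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

bridge = CvBridge()
frame_id = 0

def on_rgbd(rgb_msg, depth_msg):
    # Convert the ROS image messages and write one RGB-D pair to disk.
    global frame_id
    rgb = bridge.imgmsg_to_cv2(rgb_msg, desired_encoding="bgr8")
    depth = bridge.imgmsg_to_cv2(depth_msg, desired_encoding="passthrough")
    cv2.imwrite("rgb_%06d.png" % frame_id, rgb)
    np.save("depth_%06d.npy" % frame_id, depth)   # keep raw depth values
    frame_id += 1

rospy.init_node("less_rgbd_recorder")
rgb_sub = message_filters.Subscriber("/camera/rgb/image_raw", Image)      # hypothetical topic
depth_sub = message_filters.Subscriber("/camera/depth/image_raw", Image)  # hypothetical topic
sync = message_filters.ApproximateTimeSynchronizer([rgb_sub, depth_sub], queue_size=10, slop=0.05)
sync.registerCallback(on_rgbd)
rospy.spin()

Alternatively, rosbag record on the same topics captures the raw streams for offline extraction.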

2.3 Installation

To install LESS on your workstation and learn more about the system, please refer to LESS_install.

3. 💡LuSeg

3.1 LuSeg Overview

LuSeg is a novel two-stage segmentation method that effectively maintains the semantic consistency of multimodal features via our proposed Contrast-Driven Fusion Module (CDFM). Stage I performs single-modal training using only RGB images as input, while Stage II performs multi-modal training with both RGB and depth images as input. In Stage II, the output of the depth encoder is aligned with the output of the Stage-I RGB encoder, whose parameters are frozen during this stage, and these features serve as input to the CDFM. The final output of Stage II is the result of our LuSeg.

[Figure: LuSeg two-stage architecture with the Contrast-Driven Fusion Module (CDFM)]
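To make the two-stage idea concrete, here is a minimal PyTorch sketch of the Stage-II setup described above: a frozen RGB encoder, a trainable depth encoder whose features are pulled toward the RGB features by a simple consistency term, and a fusion head that produces segmentation logits. The backbone, the fusion layers, the cosine-based alignment term, and the class count are all illustrative placeholders, not the actual LuSeg/CDFM implementation (see train_TS.py for that).

# Minimal sketch of the Stage-II training setup; all module and loss choices are placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F
import torchvision

def make_encoder():
    # Any backbone works for this sketch; ResNet-18 truncated before pooling.
    resnet = torchvision.models.resnet18(weights=None)
    return nn.Sequential(*list(resnet.children())[:-2])    # outputs B x 512 x H/32 x W/32

class StageTwoSketch(nn.Module):
    def __init__(self, num_classes=3):                     # class count is an assumed placeholder
        super().__init__()
        self.rgb_encoder = make_encoder()                   # Stage-I weights would be loaded here
        self.depth_encoder = make_encoder()
        for p in self.rgb_encoder.parameters():             # freeze the Stage-I RGB branch
            p.requires_grad = False
        # Placeholder fusion: concatenate both feature maps and mix with 1x1 convs.
        self.fusion = nn.Sequential(
            nn.Conv2d(1024, 512, kernel_size=1), nn.ReLU(inplace=True),
            nn.Conv2d(512, num_classes, kernel_size=1))

    def forward(self, rgb, depth3):                          # depth replicated to 3 channels
        with torch.no_grad():
            f_rgb = self.rgb_encoder(rgb)
        f_depth = self.depth_encoder(depth3)
        logits = self.fusion(torch.cat([f_rgb, f_depth], dim=1))
        logits = F.interpolate(logits, size=rgb.shape[-2:], mode="bilinear", align_corners=False)
        # Simple consistency term pulling depth features toward the frozen RGB features;
        # LuSeg's actual contrast-driven loss may differ.
        align_loss = 1.0 - F.cosine_similarity(f_rgb.flatten(1), f_depth.flatten(1)).mean()
        return logits, align_loss

# Usage with random tensors, just to show the shapes involved.
model = StageTwoSketch(num_classes=3)
rgb = torch.randn(2, 3, 480, 640)
depth = torch.randn(2, 3, 480, 640)
logits, align_loss = model(rgb, depth)
loss = F.cross_entropy(logits, torch.zeros(2, 480, 640, dtype=torch.long)) + 0.1 * align_loss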

3.2 Dataset

LunarSeg is an RGB-D dataset of lunar obstacles, covering both positive and negative instances. It was collected using the LESS system and is available for download at Google Drive.
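The repo's own loaders define the exact folder layout and preprocessing. Purely as an illustration of what one RGB-D segmentation sample consists of, a loader for a hypothetical <root>/rgb, <root>/depth, <root>/label layout could look like the sketch below; the folder names and the depth normalization are assumptions, not the actual LunarSeg format.

# Rough illustration only: an RGB-D segmentation dataset with a hypothetical folder layout.
import os
import numpy as np
import torch
from torch.utils.data import Dataset
from PIL import Image

class RGBDSegDataset(Dataset):
    """Loads (rgb, depth, label) triples from <root>/rgb, <root>/depth, <root>/label."""
    def __init__(self, root):
        self.root = root
        self.names = sorted(os.listdir(os.path.join(root, "rgb")))

    def __len__(self):
        return len(self.names)

    def __getitem__(self, i):
        name = self.names[i]
        rgb = np.array(Image.open(os.path.join(self.root, "rgb", name)), dtype=np.float32) / 255.0
        depth = np.array(Image.open(os.path.join(self.root, "depth", name)), dtype=np.float32)
        depth = depth / (depth.max() + 1e-6)                  # crude normalization, assumption only
        label = np.array(Image.open(os.path.join(self.root, "label", name)), dtype=np.int64)
        return (torch.from_numpy(rgb).permute(2, 0, 1),       # 3 x H x W
                torch.from_numpy(depth).unsqueeze(0),         # 1 x H x W
                torch.from_numpy(label))                      # H x W class indices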

3.3 Pre-trained weights

The pre-trained weights of LuSeg can be downloaded from here.

3.4 Training and Evaluation

We train and evaluate our code with Python 3.7, CUDA 12.1, and PyTorch 2.3.1.

Train

You can download our pretrained weights to reproduce our results, or you can train the LuSeg model on the LunarSeg dataset yourself by running the following commands:

# Stage I
python train_RGB.py --data_dir /your/path/to/LunarSeg/ --batch_size 4 --gpu_ids 0

# Stage II
python train_TS.py --data_dir /your/path/to/LunarSeg/ --batch_size 4 --gpu_ids 0 --rgb_dir /your/path/to/LunarSeg/StageI/trained_rgb/weight/

Evaluation

You can evaluate the LuSeg model on the LunarSeg dataset by running the following command:

python run_demo_lunar.py --data_dir /your/path/to/LunarSeg/test/ --batch_size 2 --gpu_ids 0 --rgb_dir /your/path/to/LunarSeg/StageI/trained_rgb/weight/ --model_dir /your/path/to/LunarSeg/StageII/trained_ts/weight/
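Segmentation quality in this setting is commonly summarized as per-class IoU (intersection over union). Independent of the repo's evaluation code, a generic computation from predicted and ground-truth label maps looks like the sketch below; the three-class split used in the example is only an assumption.

# Generic per-class IoU sketch, independent of run_demo_lunar.py.
import numpy as np

def per_class_iou(pred, gt, num_classes):
    """pred, gt: integer label maps of identical shape."""
    ious = []
    for c in range(num_classes):
        inter = np.logical_and(pred == c, gt == c).sum()
        union = np.logical_or(pred == c, gt == c).sum()
        ious.append(inter / union if union > 0 else float("nan"))
    return ious

# Example with random labels and an assumed 3-class split
# (background, positive obstacle, negative obstacle).
pred = np.random.randint(0, 3, size=(480, 640))
gt = np.random.randint(0, 3, size=(480, 640))
print(per_class_iou(pred, gt, num_classes=3))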

Acknowledgement

We would like to express our sincere gratitude for the following open-source work that has been immensely helpful in the development of LuSeg.

  • InconSeg: Residual-Guided Fusion With Inconsistent Multi-Modal Data for Negative and Positive Road Obstacles Segmentation.

License

This project is free software made available under the MIT License. For details see the LICENSE file.
