Lihan Jiang*, Kerui Ren*, Mulin Yu, Linning Xu, Junting Dong, Tao Lu, Feng Zhao, Dahua Lin, Bo Dai ✉️
This repository contains the implementation of Horizon-GS, a novel approach built upon Gaussian Splatting that tackles unified reconstruction and rendering of aerial and street views. Horizon-GS addresses the key challenges of combining these perspectives with a new training strategy, overcoming viewpoint discrepancies to generate high-fidelity scenes.
- Clone this repo:

```shell
git clone https://github.com/city-super/Horizon-GS.git --recursive
cd Horizon-GS
```
- Install dependencies:

```shell
SET DISTUTILS_USE_SDK=1 # Windows only
conda env create --file environment.yml
conda activate horizon_gs
```
Here we use gsplat to unify the rendering of the different Gaussian primitives. To also accommodate 2D-GS, we choose a gsplat version that supports 2DGS.
First, create a `data/` folder inside the project path:

```shell
mkdir data
```
Next, download the following data and place it under a desired directory, e.g. `data/`:
- The HorizonGS data is available on Hugging Face.
- The UCGS dataset is provided by the paper's authors here.
- The MatrixCity dataset can be downloaded from Hugging Face/Openxlab/Baidu Netdisk [access code: hqnn].
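After downloading, the `data/` folder might be organized like the sketch below. The subfolder names here are purely illustrative assumptions; use whatever paths the files under `config/` actually reference:

```shell
# Illustrative layout only -- the actual folder names should match
# the dataset paths referenced in config/<dataset>/config.yaml.
mkdir -p data/matrix_city data/ucgs data/horizon_gs
ls data
```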
For training a small scene like Block_small, first generate the config and then run it:

```shell
# generate config; we have provided the configs for all datasets in the config folder
python preprocess/data_preprocess.py --config config/<dataset>/config.yaml
# train coarse
python train.py --config config/<dataset>/coarse.yaml
# train fine
python train.py --config config/<dataset>/fine.yaml
```
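The three steps above can be chained into a small helper so a failed stage stops the pipeline. This is only a convenience sketch: the dataset name is a placeholder for a folder under `config/`, and `DRY_RUN=echo` prints the commands instead of executing them.

```shell
# Sketch: run the small-scene pipeline (preprocess -> coarse -> fine)
# as one helper. Set DRY_RUN=echo to preview the commands.
train_small_scene() {
  dataset=$1                 # placeholder, e.g. a folder name under config/
  run=${DRY_RUN:-python}     # real run by default, echo for a dry run
  $run preprocess/data_preprocess.py --config "config/${dataset}/config.yaml" &&
  $run train.py --config "config/${dataset}/coarse.yaml" &&
  $run train.py --config "config/${dataset}/fine.yaml"
}

# Dry-run example: print the three commands for a hypothetical dataset
DRY_RUN=echo train_small_scene block_small
```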
For training a large scene like Block_A, first generate the config and then run it:

```shell
# generate config
python preprocess/data_preprocess.py --config config/<dataset>/config.yaml
# train coarse of each chunk
python train.py --config config/<dataset>/coarse.yaml
# train fine of each chunk
python train.py --config config/<dataset>/fine.yaml
# merge all chunks
python merge.py -m <path to trained model> --config config/<dataset>/config.yaml
```
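For large scenes, the same commands can be wrapped into one helper that trains both stages and then merges the chunks. Again a hedged sketch, not the repository's own tooling: the dataset and model-path arguments are placeholders, and `DRY_RUN=echo` previews the commands.

```shell
# Sketch: large-scene pipeline (preprocess -> coarse -> fine -> merge).
# Set DRY_RUN=echo to print the commands instead of executing them.
train_large_scene() {
  dataset=$1                 # placeholder for a folder name under config/
  model_dir=$2               # placeholder for the trained model output path
  run=${DRY_RUN:-python}
  for stage in config coarse fine; do
    case $stage in
      config) $run preprocess/data_preprocess.py --config "config/${dataset}/config.yaml" ;;
      *)      $run train.py --config "config/${dataset}/${stage}.yaml" ;;
    esac || return 1
  done
  $run merge.py -m "$model_dir" --config "config/${dataset}/config.yaml"
}

# Dry-run example with hypothetical names
DRY_RUN=echo train_large_scene block_a outputs/block_a
```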
We keep the manual rendering function, with usage similar to its counterpart in 3D-GS; you can run it with:

```shell
python render.py -m <path to trained model> # Generate renderings
python metrics.py -m <path to trained model> # Compute error metrics on renderings
python export_mesh.py -m <path to trained model> # Export mesh
```
- Lihan Jiang: [email protected]
- Kerui Ren: [email protected]
If you find our work helpful, please consider citing:
```
@article{jiang2024horizon,
  title={Horizon-GS: Unified 3D Gaussian Splatting for Large-Scale Aerial-to-Ground Scenes},
  author={Jiang, Lihan and Ren, Kerui and Yu, Mulin and Xu, Linning and Dong, Junting and Lu, Tao and Zhao, Feng and Lin, Dahua and Dai, Bo},
  journal={arXiv preprint arXiv:2412.01745},
  year={2024}
}
```
Please follow the LICENSE of 3D-GS.
We thank all authors from the following repositories for their excellent work:
- 3D Gaussian Splatting for Real-Time Radiance Field Rendering
- 2D Gaussian Splatting for Geometrically Accurate Radiance Fields
- Scaffold-GS: Structured 3D Gaussians for View-Adaptive Rendering
- Octree-GS: Towards Consistent Real-time Rendering with LOD-Structured 3D Gaussians
- VastGaussian: Vast 3D Gaussians for Large Scene Reconstruction
- A Hierarchical 3D Gaussian Representation for Real-Time Rendering of Very Large Datasets
- Drone-assisted Road Gaussian Splatting with Cross-view Uncertainty
- gsplat