FisherRF: Active View Selection and Uncertainty Quantification for Radiance Fields using Fisher Information
Wen Jiang, Boshu Lei, Kostas Daniilidis
| Project page | arXiv | Full paper |
This repo is built heavily on top of 3D Gaussian Splatting. Please follow their repo to set up and compile most components of 3D Gaussian Splatting. Besides that, this code was tested with the following dependencies:
einops==0.7.0
Pillow==10.2.0
pymeshlab==2023.12.post1
scipy==1.12.0
wandb==0.15.12
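To install these pinned versions in one step (a convenience sketch; the versions are copied from the list above):
# install the pinned dependencies listed above
pip install einops==0.7.0 Pillow==10.2.0 pymeshlab==2023.12.post1 scipy==1.12.0 wandb==0.15.12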
git clone git@github.com:JiangWenPL/FisherRF.git --recursive
pip install submodules/diff-gaussian-rasterization/
pip install submodules/simple-knn/
pip install -e ./diff/ -v
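After installation, a quick sanity check can confirm that the compiled submodules import correctly. The module names below are assumptions based on the standard 3D-GS submodules; adjust them if they differ in this repo:
# sanity check; module names are assumed from the standard 3D-GS submodules
python -c "import diff_gaussian_rasterization, simple_knn; print('submodules OK')"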
Please download the NeRF-Synthetic dataset from: https://drive.google.com/drive/folders/128yBriW1IG_3NJ5Rp7APSTZsJqdJdfc1 (nerf_synthetic.zip). The MipNeRF360 scenes are hosted by the paper authors here.
Please use the scripts under ./scripts/
to run experiments with different configurations. The first argument of each script is the path to the scene directory containing transforms_*.json
and the second is the path where you would like to save your experiment. For example:
bash scripts/blender_seq1.sh /PATH/TO/YOUR/DATASET/lego YOUR_EXP_PATH
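To sweep every scene in the NeRF-Synthetic set, the single-scene command above can be wrapped in a loop. The loop below is only a sketch (not one of the released scripts); the scene names are the standard Blender scenes:
# hypothetical helper: run the Blender experiment on every NeRF-Synthetic scene
DATASET_ROOT=/PATH/TO/YOUR/DATASET   # directory containing lego/, chair/, ...
EXP_ROOT=YOUR_EXP_ROOT               # per-scene experiment folders are written here
for scene in chair drums ficus hotdog lego materials mic ship; do
    bash scripts/blender_seq1.sh "${DATASET_ROOT}/${scene}" "${EXP_ROOT}/${scene}"
done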
Obtain the LF dataset here. You may need to preprocess the LF dataset following the instructions in the original 3D-GS repo.
Example: statue
# train
bash scripts/lf_cfnerf.sh /mnt/kostas-graid/datasets/wen/LF/ test-lf-statue statue
# render uncertainty
python render_uncertainty.py -m test-lf-statue --override_idxs statue
# calculate ause
python ause.py gaussian statue -m test-lf-statue --data_dir /path/to/LF/dataset
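The three steps above can be chained for any LF scene. The wrapper below is a sketch that simply parameterizes the statue example, assuming the same argument layout:
# hypothetical wrapper around the statue example; SCENE and the paths are placeholders
SCENE=statue
LF_ROOT=/path/to/LF/dataset
EXP_DIR=test-lf-${SCENE}
bash scripts/lf_cfnerf.sh "${LF_ROOT}/" "${EXP_DIR}" "${SCENE}"             # train
python render_uncertainty.py -m "${EXP_DIR}" --override_idxs "${SCENE}"     # render uncertainty
python ause.py gaussian "${SCENE}" -m "${EXP_DIR}" --data_dir "${LF_ROOT}"  # calculate AUSE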
The active mapping part of our method lives in a separate repo. Please check out its README for setup instructions.
If you find this code useful for your research, or use data generated by our method, please consider citing our paper:
@article{Jiang2023FisherRF,
title={FisherRF: Active View Selection and Uncertainty Quantification for Radiance Fields using Fisher Information},
author={Wen Jiang and Boshu Lei and Kostas Daniilidis},
journal={arXiv},
year={2023}
}
This project builds heavily on 3D Gaussian Splatting and Plenoxels. We thank the authors for their excellent work! If you use our code, please also consider citing their papers.