This repo makes Depth Pro trainable on the TartanAir dataset.

mvish7/ml-depth-pro-knowledge-distil

Tiny Depth Pro: Using Depth Pro in a Knowledge Distillation Setting to Train a Tiny Depth Pro

Getting Started

We recommend setting up a virtual environment. Using e.g. miniconda, the depth_pro package can be installed via:

conda create -n depth-pro -y python=3.9
conda activate depth-pro

pip install -e .

To download the pretrained checkpoints, run the snippet below:

source get_pretrained_models.sh   # Files will be downloaded to `checkpoints` directory.

Running from commandline

We provide a helper script to directly run the model on a single image:

# Run prediction on a single image:
depth-pro-run -i ./data/example.jpg
# Run `depth-pro-run -h` for available options.

Running from Python

from PIL import Image
import depth_pro

# Load model and preprocessing transform
model, transform = depth_pro.create_model_and_transforms()
model.eval()

# Load and preprocess an image.
image_path = "./data/example.jpg"
image, _, f_px = depth_pro.load_rgb(image_path)
image = transform(image)

# Run inference.
prediction = model.infer(image, f_px=f_px)
depth = prediction["depth"]  # Depth in [m].
focallength_px = prediction["focallength_px"]  # Focal length in pixels.
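The returned depth is a dense metric map; to eyeball a prediction, you can convert it to an 8-bit image. A minimal sketch, assuming the prediction has been moved to a NumPy array (e.g. via `depth.cpu().numpy()`); the `depth_to_png` helper and its normalization scheme are illustrative, not part of the depth_pro API:

```python
import numpy as np
from PIL import Image

def depth_to_png(depth, out_path):
    """Save a metric depth map as an 8-bit grayscale PNG.

    Illustrative helper (not part of the depth_pro API): inverts depth so
    nearer pixels render brighter, then rescales to [0, 255].
    """
    d = np.asarray(depth, dtype=np.float32)
    inv = 1.0 / np.clip(d, 1e-6, None)           # inverse depth, guard div-by-zero
    rng = max(float(inv.max() - inv.min()), 1e-12)
    inv = (inv - inv.min()) / rng                # normalize to [0, 1]
    img = Image.fromarray((inv * 255.0).astype(np.uint8))
    img.save(out_path)
    return img
```

For example, `depth_to_png(prediction["depth"].cpu().numpy(), "depth.png")` writes a quick-look visualization next to your script.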

Evaluation (boundary metrics)

Our boundary metrics can be found under eval/boundary_metrics.py and used as follows:

from eval.boundary_metrics import SI_boundary_F1, SI_boundary_Recall

# For a depth-based dataset:
boundary_f1 = SI_boundary_F1(predicted_depth, target_depth)

# For a mask-based dataset (image matting / segmentation):
boundary_recall = SI_boundary_Recall(predicted_depth, target_mask)

Citation

If you find our work useful, please cite the following paper:

@article{Bochkovskii2024:arxiv,
  author     = {Aleksei Bochkovskii and Ama\"{e}l Delaunoy and Hugo Germain and Marcel Santos and
                Yichao Zhou and Stephan R. Richter and Vladlen Koltun},
  title      = {Depth Pro: Sharp Monocular Metric Depth in Less Than a Second},
  journal    = {arXiv},
  year       = {2024},
  url        = {https://arxiv.org/abs/2410.02073},
}

License

This sample code is released under the LICENSE terms.

The model weights are released under the LICENSE terms.

Acknowledgements

Our codebase is built on multiple open-source contributions; please see Acknowledgements for more details.

Please check the paper for a complete list of references and datasets used in this work.
