Chongjie Ye*, Lingteng Qiu*, Xiaodong Gu, Qi Zuo, Yushuang Wu, Zilong Dong, Liefeng Bo, Yuliang Xiu#, Xiaoguang Han#
* Equal contribution
# Corresponding Author
We propose StableNormal, which tailors the diffusion priors for monocular normal estimation. Unlike prior diffusion-based works, we focus on enhancing estimation stability by reducing the inherent stochasticity of diffusion models (i.e., Stable Diffusion). This enables “Stable-and-Sharp” normal estimation, which outperforms multiple baselines (try Compare) and improves various real-world applications (try Demo).
- StableNormal YOSO is now available on ModelScope. We invite you to explore its features! 🔥🔥🔥 (10.11, 2024 UTC)
- StableNormal was accepted by SIGGRAPH Asia 2024 (Journal Track). (09.11, 2024 UTC)
- Release StableDelight 🔥🔥🔥 (09.07, 2024 UTC)
- Release StableNormal 🔥🔥🔥 (08.27, 2024 UTC)
We're excited to announce the release of StableDelight, our latest open-source project focusing on real-time reflection removal from textured surfaces. Check out the StableDelight repository for more details!
Please run the following commands to install the package:
git clone https://github.com/Stable-X/StableNormal.git
cd StableNormal
pip install -r requirements.txt
or install the package directly:
pip install git+https://github.com/Stable-X/StableNormal.git
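After installation, you can run a quick sanity check that the torch.hub entrypoints are reachable. This is a minimal sketch; it only lists the entrypoints exposed by the repository and requires network access to GitHub:

```python
import torch

# List the hub entrypoints exposed by the Stable-X/StableNormal repository;
# the output should include "StableNormal" and "StableNormal_turbo".
print(torch.hub.list("Stable-X/StableNormal", trust_repo=True))
```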
To use the StableNormal pipeline, you can instantiate the model and apply it to an image as follows:
import torch
from PIL import Image
# Load an image
input_image = Image.open("path/to/your/image.jpg")
# Create predictor instance
predictor = torch.hub.load("Stable-X/StableNormal", "StableNormal", trust_repo=True)
# Apply the model to the image
normal_image = predictor(input_image)
# Save or display the result
normal_image.save("output/normal_map.png")
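The predictor returns the normal map as a PIL image. If you need unit normal vectors for downstream processing, a minimal decoding sketch is shown below, assuming the common convention that 8-bit RGB values in [0, 255] encode normal components in [-1, 1]:

```python
import numpy as np

# Decode the 8-bit normal map into unit vectors, assuming the
# [0, 255] -> [-1, 1] per-channel encoding (an assumption, not
# something guaranteed by this README).
rgb = np.asarray(normal_image, dtype=np.float32)[..., :3]
normals = rgb / 255.0 * 2.0 - 1.0
# Re-normalize to remove quantization error.
normals /= np.linalg.norm(normals, axis=-1, keepdims=True).clip(min=1e-6)
print(normals.shape)
```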
Additional Options:
- If you need faster inference (10 times faster), use `StableNormal_turbo` (a batch-processing sketch follows these options):
predictor = torch.hub.load("Stable-X/StableNormal", "StableNormal_turbo", trust_repo=True)
- If Hugging Face is not reachable from your terminal, you can download the pretrained weights to the `weights` directory:
predictor = torch.hub.load("Stable-X/StableNormal", "StableNormal", trust_repo=True, local_cache_dir='./weights')
Compute Metrics:
This section provides guidance on evaluating your normal predictor using the DIODE dataset.
Step 1: Prepare Your Results Folder
First, make sure you have generated your normal maps and structured your results folder as shown below (a minimal sketch of populating this layout follows the tree):
├── YOUR-FOLDER-NAME
│ ├── scan_00183_00019_00183_indoors_000_010_gt.png
│ ├── scan_00183_00019_00183_indoors_000_010_init.png
│ ├── scan_00183_00019_00183_indoors_000_010_ref.png
│ ├── scan_00183_00019_00183_indoors_000_010_step0.png
│ ├── scan_00183_00019_00183_indoors_000_010_step1.png
│ ├── scan_00183_00019_00183_indoors_000_010_step2.png
│ ├── scan_00183_00019_00183_indoors_000_010_step3.png
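How the `_init`, `_ref`, and `_step*` images are produced depends on your predictor's intermediate outputs, which this README does not cover. Purely as an illustration (the source paths are placeholders), the layout can be populated like this once the ground-truth and predicted normal maps exist as PNGs:

```python
import os
import shutil

# Illustrative only: copy an existing DIODE ground-truth normal map and your
# refined prediction into the expected results layout. Paths are placeholders.
results_dir = "YOUR-FOLDER-NAME"
os.makedirs(results_dir, exist_ok=True)

scan_id = "scan_00183_00019_00183_indoors_000_010"
shutil.copy("path/to/diode_gt_normal.png", os.path.join(results_dir, f"{scan_id}_gt.png"))
shutil.copy("path/to/your_refined_prediction.png", os.path.join(results_dir, f"{scan_id}_ref.png"))
```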
Step 2: Compute Metric Values
Once your results folder is set up, you can compute the metrics for your normal predictions by running the following scripts:
# compute metrics
python ./stablenormal/metrics/compute_metric.py -i ${YOUR-FOLDER-NAME}
# compute variance
python ./stablenormal/metrics/compute_variance.py -i ${YOUR-FOLDER-NAME}
Replace ${YOUR-FOLDER-NAME} with the actual name of your results folder. Following these steps will allow you to evaluate your normal predictor's performance on the DIODE dataset.
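For reference, the core quantity behind such evaluations is the per-pixel angular error between predicted and ground-truth normals. The sketch below is independent of the provided scripts and assumes both maps use the [0, 255] -> [-1, 1] per-channel encoding:

```python
import numpy as np
from PIL import Image

def mean_angular_error_deg(pred_path, gt_path):
    """Mean angular error in degrees between two 8-bit normal maps."""
    def to_unit(path):
        n = np.asarray(Image.open(path), dtype=np.float32)[..., :3] / 255.0 * 2.0 - 1.0
        return n / np.linalg.norm(n, axis=-1, keepdims=True).clip(min=1e-6)
    pred, gt = to_unit(pred_path), to_unit(gt_path)
    cos = np.clip((pred * gt).sum(axis=-1), -1.0, 1.0)
    return float(np.degrees(np.arccos(cos)).mean())

print(mean_angular_error_deg(
    "YOUR-FOLDER-NAME/scan_00183_00019_00183_indoors_000_010_ref.png",
    "YOUR-FOLDER-NAME/scan_00183_00019_00183_indoors_000_010_gt.png",
))
```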
@article{ye2024stablenormal,
title={StableNormal: Reducing Diffusion Variance for Stable and Sharp Normal},
author={Ye, Chongjie and Qiu, Lingteng and Gu, Xiaodong and Zuo, Qi and Wu, Yushuang and Dong, Zilong and Bo, Liefeng and Xiu, Yuliang and Han, Xiaoguang},
journal={ACM Transactions on Graphics (TOG)},
year={2024},
publisher={ACM New York, NY, USA}
}