TorchCP: A Python toolbox for Conformal Prediction in Deep Learning.
Technical Report · Documentation
TorchCP is a Python toolbox for conformal prediction research on deep learning models, built on PyTorch with strong GPU acceleration. The toolbox implements representative methods (both post-hoc and training methods) for many conformal prediction tasks, including classification, regression, graph node classification, and LLMs. The basic framework of TorchCP is built on AdverTorch. This codebase is still under construction and is maintained by Hongxin Wei's research group at SUSTech.
Comments, issues, contributions, and collaborations are all welcomed!
- Added new score functions and training methods for classification, including KNN, TOPK, C-Adapter, and ConfTS.
- Introduced CP algorithms for graph node classification, such as DAPS, SNAPS, and NAPS.
- Added new conformal algorithms for regression, including CQRFM, CQRR, CQRM, and Ensemble CP.
- Introduced CP algorithms for LLMs.
- Added unit tests and examples.
- Optimized the representation of prediction sets to improve computational efficiency.
- Refactored the module design of the regression package to improve scalability.
TorchCP has implemented the following methods:
Year | Title | Venue | Code Link | Implementation | Remark |
---|---|---|---|---|---|
2023 | Conformal Prediction via Regression-as-Classification | RegML @ NeurIPS 2023 | Link | regression.score.r2ccp | |
2021 | Adaptive Conformal Inference Under Distribution Shift | NeurIPS'21 | Link | regression.predictor.aci | support time series |
2020 | A comparison of some conformal quantile regression methods | Stat | Link | regression.score.cqrm, regression.score.cqrr | |
2020 | Conformal Prediction Interval for Dynamic Time-Series | ICML'21 | Link | regression.predictor.ensemble | support time series |
2019 | Adaptive, Distribution-Free Prediction Intervals for Deep Networks | AISTATS'19 | Link | regression.score.cqrfm | |
2019 | Conformalized Quantile Regression | NeurIPS'19 | Link | regression.score.cqr | |
2017 | Distribution-Free Predictive Inference For Regression | JASA | Link | regression.predictor.split | |
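To see how the regression components above fit together, here is a hypothetical sketch modeled on the classification example later in this README. The module paths follow the Implementation column, but the class names, constructor arguments, and method signatures are assumptions and may differ from the actual API.
# Hypothetical sketch: the names and signatures below are assumptions, not the documented API.
from torchcp.regression.score import CQR                 # conformalized quantile regression score (NeurIPS'19 row)
from torchcp.regression.predictor import SplitPredictor  # split conformal predictor (JASA row)

quantile_model = ...   # a model trained to predict the lower and upper conditional quantiles
quantile_model.eval()

predictor = SplitPredictor(score_function=CQR(), model=quantile_model)
predictor.calibrate(cal_dataloader, alpha=0.1)   # target 90% marginal coverage
intervals = predictor.predict(test_instances)    # one prediction interval per test instance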
Year | Title | Venue | Code Link | Implementation |
---|---|---|---|---|
2024 | Similarity-Navigated Conformal Prediction for Graph Neural Networks | NeurIPS'24 | Link | graph.score.snaps |
2023 | Distribution Free Prediction Sets for Node Classification | ICML'23 | Link | graph.predictor.neighbors_weight |
2023 | Conformal Prediction Sets for Graph Neural Networks | ICML'23 | Link | graph.score.daps |
2023 | Uncertainty Quantification over Graph with Conformalized Graph Neural Networks | NeurIPS'23 | Link | graph.trainer.cfgnn |
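To give a flavor of what the graph scores above do, below is an illustrative, self-contained sketch of the neighborhood-diffusion idea behind DAPS: each node's non-conformity scores are mixed with the average scores of its neighbors, so that prediction sets exploit network homophily. This is a simplified re-implementation for illustration only, not TorchCP's graph.score.daps; the function name, the edge_index format, and the default lam are assumptions.
import torch

def diffuse_scores(base_scores, edge_index, lam=0.5):
    # base_scores: [num_nodes, num_classes] per-node non-conformity scores (e.g., APS scores).
    # edge_index: [2, num_edges] COO edges; each edge (u, v) lets node u aggregate scores from node v.
    # lam: diffusion weight; lam = 0 recovers the base scores unchanged.
    num_nodes = base_scores.size(0)
    row, col = edge_index
    # Node degrees, clamped to 1 below so isolated nodes keep their own scores.
    deg = torch.zeros(num_nodes, dtype=base_scores.dtype, device=base_scores.device)
    deg.scatter_add_(0, row, torch.ones_like(row, dtype=base_scores.dtype))
    # Sum neighbor scores into each target node, then average.
    neighbor_sum = torch.zeros_like(base_scores)
    neighbor_sum.index_add_(0, row, base_scores[col])
    neighbor_mean = neighbor_sum / deg.clamp(min=1).unsqueeze(1)
    # DAPS-style mix of a node's own scores with its neighborhood average.
    return (1 - lam) * base_scores + lam * neighbor_mean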
Year | Title | Venue | Code Link | Implementation |
---|---|---|---|---|
2023 | Conformal Language Modeling | ICLR'24 | Link | llm.predictor.conformal_llm |
TorchCP is still under active development. We plan to add the following methods in future releases:
Year | Title | Venue | Code |
---|---|---|---|
2022 | Training Uncertainty-Aware Classifiers with Conformalized Deep Learning | NeurIPS'22 | Link |
2022 | Adaptive Conformal Predictions for Time Series | ICML'22 | Link |
2022 | Conformal Prediction Sets with Limited False Positives | ICML'22 | Link |
2021 | Optimized conformal classification using gradient descent approximation | arXiv | |
TorchCP is developed with Python 3.9 and PyTorch 2.0.1. To install TorchCP, simply run
pip install torchcp
To install from the TestPyPI server, run
pip install --index-url https://test.pypi.org/simple/ --no-deps torchcp
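After installation, you can run a quick import check. The __version__ attribute is a common packaging convention but is treated here as an assumption, so the snippet falls back to a plain message if it is absent:
import torchcp
print(getattr(torchcp, "__version__", "torchcp imported successfully"))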
Here, we provide a simple example for a classification task, using the THR score and SplitPredictor.
from torchcp.classification.score import THR
from torchcp.classification.predictor import SplitPredictor
# Prepare the calibration and test dataloaders.
cal_dataloader = ...
test_dataloader = ...
# Prepare a trained PyTorch model.
model = ...
model.eval()
# Options for the score function: THR, APS, SAPS, RAPS.
# Define a conformal prediction algorithm. Options: SplitPredictor, ClusteredPredictor, ClassWisePredictor.
predictor = SplitPredictor(score_function=THR(), model=model)
# Calibrate the predictor with significance level 0.1.
predictor.calibrate(cal_dataloader, alpha=0.1)
#########################################
# Predict for test instances
#########################################
test_instances = ...
predict_sets = predictor.predict(test_instances)
print(predict_sets)
#########################################
# Evaluate the coverage rate and average set size on a given dataset
#########################################
result_dict = predictor.evaluate(test_dataloader)
print(result_dict["Coverage_rate"], result_dict["Average_size"])
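For reference, the two reported metrics can also be computed directly from the prediction sets with plain Python. The helper below is an illustrative sketch rather than part of the TorchCP API, and it assumes each prediction set is an iterable of candidate labels:
# Illustrative helper, not part of the TorchCP API.
def coverage_and_size(prediction_sets, labels):
    n = len(labels)
    # Fraction of test points whose true label falls inside its prediction set.
    coverage = sum(int(y in pred_set) for pred_set, y in zip(prediction_sets, labels)) / n
    # Mean number of labels per prediction set.
    avg_size = sum(len(pred_set) for pred_set in prediction_sets) / n
    return coverage, avg_size
With exchangeable calibration and test data, the marginal coverage should be close to 1 - alpha (0.9 in this example).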
You may find more tutorials in the examples folder.
This project is licensed under the LGPL. The terms and conditions can be found in the LICENSE and LICENSE.GPL files.
If you find our repository useful for your research, please consider citing the following technical report:
@misc{wei2024torchcp,
title={TorchCP: A Library for Conformal Prediction based on PyTorch},
author={Hongxin Wei and Jianguo Huang},
year={2024},
eprint={2402.12683},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
We welcome you to cite the following works:
@inproceedings{huangconformal,
title={Conformal Prediction for Deep Classifier via Label Ranking},
author={Huang, Jianguo and Xi, HuaJun and Zhang, Linjun and Yao, Huaxiu and Qiu, Yue and Wei, Hongxin},
booktitle={Forty-first International Conference on Machine Learning},
year={2024}
}
@article{xi2024does,
title={Does Confidence Calibration Help Conformal Prediction?},
author={Xi, Huajun and Huang, Jianguo and Feng, Lei and Wei, Hongxin},
journal={arXiv preprint arXiv:2402.04344},
year={2024}
}
@article{liu2024c,
title={C-Adapter: Adapting Deep Classifiers for Efficient Conformal Prediction Sets},
author={Liu, Kangdao and Zeng, Hao and Huang, Jianguo and Zhuang, Huiping and Vong, Chi-Man and Wei, Hongxin},
journal={arXiv preprint arXiv:2410.09408},
year={2024}
}