Chang Liu <[email protected]; [email protected]>,
and Jun Zhu. AAAI 2018.
[Paper] [Appendix] [Slides] [Poster]
The repository implements the proposed method, Riemannian Stein Variational Gradient Descent (RSVGD), in both coordinate space and embedded space, together with its applications to Bayesian Logistic Regression (BLR) and the Spherical Admixture Model (SAM) (Reisinger et al., 2010). The repository also includes implementations of baseline methods: Stein Variational Gradient Descent (SVGD) (Liu & Wang, 2016), based on their codes, for the BLR experiment, and Stochastic Gradient Geodesic Monte Carlo (SGGMC) and geodesic Stochastic Gradient Nosé-Hoover Thermostats (gSGNHT) (Liu et al., 2016), based on their codes, for the SAM experiment.
RSVGD is the first particle-based variational inference method on Riemannian manifolds. For Bayesian inference tasks with a Euclidean support space, as in BLR, RSVGD can exploit information geometry, implemented in the coordinate space of the Fisher distribution manifold, to converge faster than SVGD. For tasks whose support is a Riemannian manifold, especially one with no global coordinate system such as the hypersphere in SAM, SVGD is not applicable, while RSVGD handles the manifold structure efficiently in the embedded space of the manifold. In either task, RSVGD achieves better results than classical parameter-based variational inference methods, and is more iteration- and particle-efficient than MCMC methods.
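To make the particle-update idea concrete, below is a minimal NumPy sketch of the baseline SVGD update (Liu & Wang, 2016) that RSVGD generalizes to Riemannian manifolds. The RBF bandwidth `h`, the step size, and the toy Gaussian target are illustrative choices, not the repository's settings, and this is not the RSVGD update itself.

```python
import numpy as np

def svgd_step(X, grad_logp, h=1.0, step=0.1):
    """One SVGD update (Liu & Wang, 2016) with an RBF kernel.

    X: (n, d) particle positions; grad_logp: maps (n, d) -> (n, d)
    scores of the target density. h and step are illustrative defaults.
    """
    n = X.shape[0]
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq / h)                          # (n, n) RBF kernel matrix
    # Attractive term: kernel-weighted average of the target scores.
    drive = K @ grad_logp(X)
    # Repulsive term: sum_j grad_{x_j} k(x_j, x_i); keeps particles spread out.
    repulse = 2.0 / h * (K.sum(axis=1, keepdims=True) * X - K @ X)
    return X + step * (drive + repulse) / n

# Toy run: 20 particles, started as a tight cluster, approximate N(0, I).
rng = np.random.default_rng(0)
X = 0.01 * rng.standard_normal((20, 2))
for _ in range(100):
    X = svgd_step(X, lambda x: -x)               # score of N(0, I) is -x
```

The repulsive term is what distinguishes SVGD from plain gradient ascent on log-density: it prevents the particles from collapsing onto the posterior mode.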
Corresponds to the folder "bayesian_logistic_regression/".
Codes:
Implemented in Python with NumPy based on the codes by Liu & Wang (2016).
Data:
We did not contribute to the data sets here. The data sets used in the experiment are:
- "covertype.mat": Covertype dataset from the LIBSVM Data Repository. It is also used in SVGD by Liu & Wang (2016).
- "benchmarks.mat": benchmark datasets compiled by Mika et al. (1999).
Corresponds to the folder "SAM/".
Codes:
- RSVGD, GMC and SGGMC are implemented in C++ based on the codes by Liu et al. (2016). The codes employ Eigen for linear algebra and OpenMP for parallelization.
- Variational inference methods are implemented in MATLAB also based on the codes by Liu et al. (2016).
- To compile the C++ codes, just type "make" in each subfolder. See the instructions by Liu et al. (2016) for more details.
Data:
We did not contribute to the data set here. The data set used in the experiment is:
- "20News-diff": The dataset is processed and used by Liu et al. (2016). It is a subset of the 20Newsgroups dataset with normalized tf-idf feature. See the dataset website or its "README.md" file for more details.
@inproceedings{liu2018riemannian,
  title={{R}iemannian {S}tein variational gradient descent for {B}ayesian inference},
  author={Liu, Chang and Zhu, Jun},
  booktitle={The 32nd AAAI Conference on Artificial Intelligence},
  pages={3627--3634},
  year={2018},
  organization={AAAI Press},
  address={New Orleans, Louisiana, USA},
  url={https://aaai.org/ocs/index.php/AAAI/AAAI18/paper/view/17275},
  keywords={Bayesian inference; Riemann manifold; Information geometry; kernel methods},
}