Codes for "Riemannian Stein Variational Gradient Descent for Bayesian Inference" (AAAI-18)


Chang Liu <[email protected]; [email protected]>, and Jun Zhu. AAAI 2018.

[Paper] [Appendix] [Slides] [Poster]

Introduction

The repository implements the proposed method, Riemannian Stein Variational Gradient Descent (RSVGD), in both the coordinate space and the embedded space, together with its applications to Bayesian Logistic Regression (BLR) and the Spherical Admixture Model (SAM) (Reisinger et al., 2010). It also includes implementations of the baseline methods: Stein Variational Gradient Descent (SVGD) (Liu & Wang, 2016), based on its authors' code, for the BLR experiment, and Stochastic Gradient Geodesic Monte Carlo (SGGMC) and the geodesic Stochastic Gradient Nosé-Hoover Thermostat (gSGNHT) (Liu et al., 2016), based on their code, for the SAM experiment.

RSVGD is the first particle-based variational inference method on Riemannian manifolds. For Bayesian inference tasks with a Euclidean support space, as in BLR, RSVGD can exploit information geometry in the coordinate space of the distribution manifold (with the Fisher metric) to converge faster than SVGD. For tasks whose support is a Riemannian manifold, especially one without a global coordinate system such as the hypersphere in SAM, SVGD is not applicable, whereas RSVGD handles the manifold structure efficiently in the embedded space of the manifold. In both tasks, RSVGD achieves better results than classical parameter-based variational inference methods, and is more iteration- and particle-efficient than MCMC methods.
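
To make the baseline concrete, below is a minimal sketch of one step of vanilla SVGD (Liu & Wang, 2016), the Euclidean method that RSVGD generalizes. It is an illustration written for this README, not code from this repository: the function names, the fixed-bandwidth RBF kernel, and the Eigen-based interface are all assumptions. RSVGD replaces the plain Euclidean update at the end with one that accounts for the Riemannian metric (e.g. the Fisher metric in coordinate space).

	// Minimal sketch of one vanilla-SVGD step (not this repository's implementation).
	#include <Eigen/Dense>
	#include <cmath>
	#include <functional>
	
	using Eigen::MatrixXd;
	using Eigen::VectorXd;
	
	// X holds n particles as rows (n x d); grad_log_p returns the score of the
	// target posterior at one particle.  Returns the updated particle matrix.
	MatrixXd svgd_step(const MatrixXd& X,
	                   const std::function<VectorXd(const VectorXd&)>& grad_log_p,
	                   double h,      // RBF kernel bandwidth (assumed fixed here)
	                   double eps) {  // step size
	    const int n = static_cast<int>(X.rows());
	    MatrixXd phi = MatrixXd::Zero(X.rows(), X.cols());
	    for (int i = 0; i < n; ++i) {
	        for (int j = 0; j < n; ++j) {
	            VectorXd diff = (X.row(j) - X.row(i)).transpose();        // x_j - x_i
	            double k = std::exp(-diff.squaredNorm() / (2.0 * h * h)); // RBF kernel
	            // Kernel-weighted score term plus the repulsive term grad_{x_j} k(x_j, x_i).
	            phi.row(i) += (k * grad_log_p(X.row(j).transpose())
	                           - (k / (h * h)) * diff).transpose();
	        }
	    }
	    phi /= static_cast<double>(n);
	    // Plain Euclidean update; RSVGD replaces this with a metric-aware step.
	    return X + eps * phi;
	}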

Instructions

Bayesian Logistic Regression (BLR)

Corresponds to the folder "bayesian_logistic_regression/".

Spherical Admixture Model (SAM)

Corresponds to the folder "SAM/".

  • Codes:

    • RSVGD, GMC, and SGGMC are implemented in C++ based on the code by Liu et al. (2016). The code uses Eigen for linear algebra and OpenMP for parallelization (a generic sketch of an embedded-space update on the hypersphere is given after this list).
    • The variational inference methods are implemented in MATLAB, also based on the code by Liu et al. (2016).
    • To compile the C++ code, just type "make" in each subfolder. See the instructions by Liu et al. (2016) for more details.
  • Data:

    The data set here is not our contribution. The data set used in the experiment is:
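
Since SAM's support involves hyperspheres, embedded-space updates must keep particles on the sphere. The sketch below (referenced in the Codes item above) is a generic illustration of that idea, namely tangent-space projection followed by a retraction by renormalization, written for this README with Eigen; it is not the update rule implemented in this folder, which follows the paper and the geodesic-based code of Liu et al. (2016).

	// Generic illustration (not this repository's code) of keeping a particle on the
	// unit hypersphere during an embedded-space update.
	#include <Eigen/Dense>
	
	using Eigen::VectorXd;
	
	VectorXd sphere_step(const VectorXd& x,   // current particle, ||x|| = 1
	                     const VectorXd& v,   // Euclidean update direction at x
	                     double eps) {        // step size
	    VectorXd v_tan = v - x.dot(v) * x;    // project: remove the radial component of v
	    VectorXd y = x + eps * v_tan;         // Euler step in the embedding space
	    return y / y.norm();                  // retraction: renormalize onto the sphere
	}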

Citation

	@inproceedings{liu2018riemannian,
	  title={{R}iemannian {S}tein variational gradient descent for {B}ayesian inference},
	  author={Liu, Chang and Zhu, Jun},
	  booktitle={The 32nd AAAI Conference on Artificial Intelligence},
	  pages={3627--3634},
	  year={2018},
	  organization={AAAI Press},
	  address={New Orleans, Louisiana, USA},
	  url = {https://aaai.org/ocs/index.php/AAAI/AAAI18/paper/view/17275},
	  keywords = {Bayesian inference; Riemann manifold; Information geometry; kernel methods},
	}
