Julia package for automated Bayesian inference on a factor graph with reactive message passing


Overview

RxInfer.jl is a Julia package for automatic Bayesian inference on a factor graph with reactive message passing.

Given a probabilistic model, RxInfer performs efficient message passing-based Bayesian inference. It uses the model structure to generate an algorithm that consists of a sequence of local computations on a Forney-style factor graph (FFG) representation of the model.

Performance and scalability

RxInfer.jl has been designed with a focus on efficiency, scalability, and maximum performance for running Bayesian inference with message passing. Below is a comparison between RxInfer.jl and Turing.jl on latent state estimation in a linear multivariate Gaussian state-space model. Turing.jl is a state-of-the-art Julia-based general-purpose probabilistic programming package. Still, RxInfer.jl executes the state inference task faster and more accurately. RxInfer.jl accomplishes this by taking advantage of any conjugate likelihood-prior pairings in the model, whose analytical posteriors are known to RxInfer.jl. As a result, in models with conjugate pairings, RxInfer.jl often beats general-purpose probabilistic programming packages in terms of computational load, speed, memory, and accuracy. Note, however, that RxInfer.jl also supports non-conjugate inference.

[Figures: benchmark comparison with Turing.jl and scalability of inference performance]
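
To illustrate why conjugacy helps, below is a minimal sketch of the closed-form Beta-Bernoulli posterior update that such conjugate pairings admit. It uses only Distributions.jl and is not part of RxInfer's API; the variable names are illustrative.

using Distributions

# Prior belief about the coin bias θ
prior = Beta(2.0, 7.0)

# Observed coin flips (1 = heads, 0 = tails)
observations = [1, 0, 1, 1, 0, 1, 1, 1]

# For a Beta(a, b) prior and a Bernoulli likelihood, the posterior is available
# in closed form as Beta(a + #heads, b + #tails) -- no sampling is required
heads = count(==(1), observations)
tails = length(observations) - heads

a, b      = params(prior)
posterior = Beta(a + heads, b + tails)

println(mean(posterior))  # posterior mean estimate of the bias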

Faster inference with better results

RxInfer.jl not only beats general-purpose Bayesian inference methods in execution speed and scalability, but also provides more accurate results for various complex problems. Check out our examples!

[Figures: inference results with RxInfer and inference results with HMC]

The benchmark and accuracy experiment, which generated these plots, is available in the benchmarks/ folder.

Installation

Install RxInfer through the Julia package manager:

] add RxInfer

Optionally, use ] test RxInfer to validate the installation by running the test suite.
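
Equivalently, the installation and test steps can be scripted with the standard Pkg API (the calls below are standard Julia package-manager functions, not RxInfer-specific):

using Pkg

Pkg.add("RxInfer")   # install the package
Pkg.test("RxInfer")  # optionally run its test suite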

Getting Started

There are examples available to get you started in the examples/ folder. Alternatively, preview the same examples in the documentation.

Coin flip simulation

Here we show a simple example of how to use RxInfer.jl for Bayesian inference problems. In this example we want to estimate the bias of a coin, in the form of a probability distribution, in a coin flip simulation.

Let's start by creating a dataset. For simplicity, in this example we will use a static pre-generated dataset. Each sample can be thought of as the outcome of a single flip, which is either heads or tails (1 or 0). We will assume that our virtual coin is biased and lands heads up on 75% of the trials (on average).

First, let's set up our environment by importing all needed packages:

using RxInfer, Random

Next, let's define our dataset:

n = 500  # Number of coin flips
p = 0.75 # Bias of the coin

distribution = Bernoulli(p)
dataset      = float.(rand(distribution, n))
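
As a quick sanity check (an extra step, not part of the original example), the empirical mean of the generated samples should be close to the true bias p:

using Statistics

# Fraction of heads in the dataset; should be roughly 0.75
println(mean(dataset))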

Model specification

In a Bayesian setting, the next step is to specify our probabilistic model. This amounts to specifying the joint probability of the random variables of the system.

Likelihood

We will assume that the outcome of each coin flip is governed by the Bernoulli distribution, i.e.

$$ p(y_i \mid \theta) = \theta^{y_i} (1 - \theta)^{1 - y_i}, $$

where $y_i = 1$ represents "heads" and $y_i = 0$ represents "tails". The underlying probability of the coin landing heads up for a single coin flip is $\theta \in [0, 1]$.

Prior

We will choose the conjugate prior of the Bernoulli likelihood function defined above, namely the beta distribution, i.e.

$$ p(\theta) = \mathrm{Beta}(\theta \mid a, b) = \frac{\theta^{a - 1} (1 - \theta)^{b - 1}}{B(a, b)}, $$

where $a$ and $b$ are the hyperparameters that encode our prior beliefs about the possible values of $\theta$, and $B(a, b)$ is the beta function. We will assign values to the hyperparameters in a later step.
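
For instance, with the values $a = 2.0$ and $b = 7.0$ used in the model below, the prior mean is

$$ \mathbb{E}[\theta] = \frac{a}{a + b} = \frac{2.0}{2.0 + 7.0} \approx 0.22, $$

which encodes a mild initial belief that the coin is more likely to land tails; the observed data will update this belief.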

Joint probability

The joint probability is given by the multiplication of the likelihood and the prior, i.e.

$$ p(y_{1:n}, \theta) = p(\theta) \prod_{i=1}^{n} p(y_i \mid \theta). $$

Now let's see how to specify this model using the syntax of the GraphPPL.jl package.

# GraphPPL.jl exports the `@model` macro for model specification
# It accepts a regular Julia function and builds an FFG under the hood
@model function coin_model(n)

    # `datavar` creates data 'inputs' in our model
    # We will pass data to these inputs later on
    # In this example we create a sequence of inputs that accepts Float64
    y = datavar(Float64, n)
    
    # We endow the θ parameter of our model with some prior
    θ ~ Beta(2.0, 7.0)
    
    # We assume that the outcome of each coin flip 
    # is governed by the Bernoulli distribution
    for i in 1:n
        y[i] ~ Bernoulli(θ)
    end
    
end

As you can see, RxInfer offers a model specification syntax that closely resembles the mathematical equations defined above. We use the datavar function to create "clamped" variables that will take specific values at a later point. The θ ~ Beta(2.0, 7.0) expression creates a random variable θ and assigns it as the output of a Beta node in the corresponding FFG.

Inference specification

Once we have defined our model, the next step is to use the RxInfer API to infer quantities of interest. To do this, we can use the generic inference function from RxInfer.jl, which supports static datasets.

result = inference(
    model = coin_model(length(dataset)),
    data  = (y = dataset, )
)
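
The returned result object collects the computed posterior marginals. Below is a minimal sketch of how the estimate can be inspected, assuming the marginals are exposed through a result.posteriors dictionary keyed by variable name:

# Extract the posterior marginal over θ and compare its mean with the true bias
θposterior = result.posteriors[:θ]

println("Estimated bias: ", mean(θposterior))
println("True bias: ", p)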

[Figure: estimated posterior distribution of the coin bias θ]

Where to go next?

There is a set of examples available in the RxInfer repository that demonstrates the more advanced features of the package. Alternatively, you can head to the documentation, which provides more detailed information on how to use RxInfer to specify more complex probabilistic models.

Ecosystem

The RxInfer framework consists of three core packages developed by BIASlab:

  • ReactiveMP.jl - the underlying message passing-based inference engine
  • GraphPPL.jl - model and constraints specification package
  • Rocket.jl - reactive extensions package for Julia

License

MIT License Copyright (c) 2021-2023 BIASlab
