Composable Bayesian Modeling with Soss.jl

07/29/2021, 5:00 PM–5:10 PM UTC


Soss is a probabilistic programming language (PPL) with first-class, composable models. Through dynamic code generation, Soss can achieve speedups of several orders of magnitude for some models, for example by symbolically simplifying the log-density.

In this talk, we'll discuss the goals and design choices in Soss that distinguish it from other PPLs, followed by an overview of upcoming work.


First-Class, Composable Models

Soss models can be used and composed much like functions. This allows larger models to be built from smaller, reusable components, which in some cases can be developed and tested independently.
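As a rough sketch of what this composition can look like (the model and variable names here are illustrative, not from the talk, and the exact `@model` syntax may differ across Soss versions):

```julia
using Soss

# A small, reusable component: a prior over a location parameter
locprior = @model begin
    μ ~ Normal(0, 5)
end

# A larger model that plugs the component in as a sub-model.
# Sub-model results are accessed like named-tuple fields.
obs = @model begin
    p ~ locprior()
    x ~ Normal(p.μ, 1)
end
```

Because `locprior` is an ordinary first-class value, it can be tested on its own before being embedded in `obs`.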

Dynamic Code Generation

Soss uses runtime code generation to build efficient inference primitives, specialized for the model and input types. New primitives can easily work with arbitrary data structures, making the system very flexible. Models are fully generative and determine joint distributions; in particular, a model has rand and logdensity methods like any other measure.
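Concretely, this means a model can be sampled and evaluated directly. A minimal sketch (assuming the `rand`/`logdensity` interface described above; exact names and call forms may vary between Soss releases):

```julia
using Soss

m = @model begin
    μ ~ Normal(0, 1)
    x ~ Normal(μ, 1)
end

# Calling the model yields a joint distribution over (μ, x)
joint = m()

# Sample from the joint distribution; s is a named tuple of all variables
s = rand(joint)

# Evaluate the joint log-density at that sample
logdensity(joint, s)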

Model Transformations

Internally, a model is represented as a directed graph with an AST (a Julia Expr) at each node. This makes it easy to transform one model into another based on its dependency or AST structure, for example to compute Markov blankets, reparameterize, or change a model to output the latent conditional distributions used along the way.
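A sketch of how such transformations might be invoked (the function names `markovblanket` and `Do` are assumptions based on Soss's transformation API, not confirmed by this abstract):

```julia
using Soss

m = @model begin
    α ~ Normal(0, 1)
    β ~ Normal(0, 1)
    y ~ Normal(α + β, 1)
end

# A transformation takes a model and returns a new model, e.g.:
mb = markovblanket(m, :β)   # restrict to the Markov blanket of β
m0 = Do(m, α = 0.0)         # intervene, fixing α to a constant
```

Each result is itself a first-class model, so transformations compose freely.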


Soss builds on MeasureTheory.jl, falling back to Distributions.jl when needed, so it inherits MeasureTheory's benefits. For example, fewer type constraints on constructors mean Soss can evaluate a log-density symbolically. Coupled with code generation, this enables highly optimized code.
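As an illustration of the symbolic path (the function name `symlogdensity` is assumed from Soss's documented API; the model itself is hypothetical):

```julia
using Soss

m = @model N begin
    μ ~ Normal(0, 1)
    x ~ Normal(μ, 1) |> iid(N)
end

# A symbolic expression for the log-density, which Soss can simplify
# (e.g. collapsing the sum over the N iid terms) before generating code
symlogdensity(m(N = 100))
```

The simplified expression is then compiled, so the per-evaluation cost no longer scales with redundant structure in the model.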
