distreqx (pronounced "dist-rex") is a JAX-based library providing implementations of distributions, bijectors, and tools for statistical and probabilistic machine learning, with all the benefits of JAX (native GPU/TPU acceleration, differentiability, vectorization, workload distribution, XLA compilation, etc.).
This package originated as a reimplementation of distrax (itself a subset of TensorFlow Probability (TFP), with some new features and an emphasis on JAX compatibility) using Equinox. As a result, much of the original code/comments/documentation/tests is taken directly from, or adapted from, distrax (the original distrax copyright notice is reproduced at the end of this README).
Current features include:
- Probability distributions
- Bijectors
## Installation

```
pip install distreqx
```

or

```
git clone https://github.com/lockwo/distreqx.git
cd distreqx
pip install -e .
```
Requires Python 3.9+, JAX 0.4.11+, and Equinox 0.11.0+.
## Documentation

Available at https://lockwo.github.io/distreqx/.
## Quick example

```python
import jax
from jax import numpy as jnp
from distreqx import distributions

key = jax.random.PRNGKey(1234)
mu = jnp.array([-1., 0., 1.])
sigma = jnp.array([0.1, 0.2, 0.3])

dist = distributions.MultivariateNormalDiag(mu, sigma)
samples = dist.sample(key)
print(dist.log_prob(samples))
```
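Because a distribution here is an ordinary JAX pytree, its log-density can be differentiated with respect to its parameters. A minimal sketch, reusing `mu`, `sigma`, and `samples` from the example above:

```python
def neg_log_prob(mu, sigma, x):
    # Rebuild the distribution inside the function so that gradients
    # flow through its parameters.
    d = distributions.MultivariateNormalDiag(mu, sigma)
    return -d.log_prob(x)

# Gradients of the negative log-density w.r.t. mu and sigma.
grads = jax.grad(neg_log_prob, argnums=(0, 1))(mu, sigma, samples)
```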
## Differences with distrax

- No official support/interoperability with TFP
- The concept of a batch dimension is dropped. If you want to operate on a batch, use `vmap` (note that this can be used at construction time as well, e.g. `vmap`-ing the construction of a `ScalarAffine`; see the sketch after this list)
- Broader pytree enablement
- Strict abstract/final design pattern
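A minimal sketch of the `vmap` pattern. It assumes `bijectors.ScalarAffine` is constructed from `shift` and `scale` and exposes a `forward` method, mirroring distrax; only `MultivariateNormalDiag.sample`/`log_prob` appear in the quick example above:

```python
import jax
from jax import numpy as jnp
from distreqx import bijectors, distributions

# vmap the construction itself: a "batch" of 4 scalar affine bijectors.
# (Assumes ScalarAffine(shift, scale) with a forward method, as in distrax.)
shifts = jnp.arange(4.0)
scales = jnp.full((4,), 2.0)
batched_bijector = jax.vmap(bijectors.ScalarAffine)(shifts, scales)

# Apply each bijector to its own input by vmapping over the pytree.
xs = jnp.ones((4,))
ys = jax.vmap(lambda b, x: b.forward(x))(batched_bijector, xs)

# The same pattern batches distributions: construct under vmap, then
# vmap sampling and log_prob over the leading axis.
mus = jnp.zeros((4, 3))
sigmas = jnp.ones((4, 3))
batched_dist = jax.vmap(distributions.MultivariateNormalDiag)(mus, sigmas)

keys = jax.random.split(jax.random.PRNGKey(0), 4)
samples = jax.vmap(lambda d, k: d.sample(k))(batched_dist, keys)
log_probs = jax.vmap(lambda d, x: d.log_prob(x))(batched_dist, samples)
print(log_probs.shape)  # (4,)
```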
## Citation

If you found this library useful in academic research, please cite:

```bibtex
@software{lockwood2024distreqx,
  title = {distreqx: Distributions and Bijectors in Jax},
  author = {Owen Lockwood},
  url = {https://github.com/lockwo/distreqx},
  doi = {10.5281/zenodo.13764512},
}
```
(Also consider starring the project on GitHub.)
## See also

- GPJax: Gaussian processes in JAX.
- flowjax: Normalizing flows in JAX.
- Optimistix: root finding, minimisation, fixed points, and least squares.
- Lineax: linear solvers.
- sympy2jax: SymPy<->JAX conversion; train symbolic expressions via gradient descent.
- diffrax: numerical differential equation solvers in JAX. Autodifferentiable and GPU-capable.
- Awesome JAX: a longer list of other JAX projects.
## Original distrax copyright

Copyright 2021 DeepMind Technologies Limited. All Rights Reserved.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
==============================================================================