
InvertibleNetworks.jl


Building blocks for invertible neural networks in the Julia programming language.

  • Memory efficient building blocks for invertible neural networks
  • Hand-derived gradients, Jacobians J, and log|J|
  • Flux integration
  • Support for Zygote and ChainRules
  • GPU support
  • Includes various examples of invertible neural networks, normalizing flows, variational inference, and uncertainty quantification

Installation

InvertibleNetworks is registered and can be added like any standard Julia package with the command:

] add InvertibleNetworks
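
Equivalently, from the Pkg API:

using Pkg; Pkg.add("InvertibleNetworks")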

Uncertainty-aware image reconstruction

Due to its memory scaling, InvertibleNetworks.jl has been particularly successful at Bayesian posterior sampling with simulation-based inference. To get started with this application, refer to a simple example (conditional sampling for MNIST inpainting), and feel free to modify this script for your application; please reach out to us for help. A short code sketch of the conditional-flow workflow follows the figure below.

[Figure: conditional posterior samples for MNIST inpainting]
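
A minimal sketch of this workflow, assuming the NetworkConditionalGlow constructor and the forward/inverse interface used in the package's conditional-sampling examples (all sizes below are illustrative):

using InvertibleNetworks, Flux

# Conditional Glow: (target channels, conditioning channels, hidden width,
# multiscale levels L, flow steps per level K)
G = NetworkConditionalGlow(1, 1, 32, 2, 5)

# X: unknown images; Y: the corresponding observations (e.g. masked images)
X = randn(Float32, 28, 28, 1, 4)
Y = randn(Float32, 28, 28, 1, 4)

# Forward pass returns latent codes and the log-determinant used in training
ZX, ZY, logdet = G.forward(X, Y)

# After training, sample the posterior for a fixed observation by drawing
# latent noise and inverting the network
ZX_sample = randn(Float32, size(ZX))
X_post = G.inverse(ZX_sample, ZY)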

Building blocks

  • 1x1 Convolutions using Householder transformations (example; see the composition sketch after this list)

  • Residual block (example)

  • Invertible coupling layer from Dinh et al. (2017) (example)

  • Invertible hyperbolic layer from Lensink et al. (2019) (example)

  • Invertible coupling layer from Putzky and Welling (2019) (example)

  • Invertible recursive coupling layer HINT from Kruse et al. (2020) (example)

  • Activation normalization (Kingma and Dhariwal, 2018) (example)

  • Various activation functions (Sigmoid, ReLU, leaky ReLU, GaLU)

  • Objective and misfit functions (mean squared error, log-likelihood)

  • Dimensionality manipulation: squeeze/unsqueeze (column, patch, checkerboard), split/cat

  • Squeeze/unsqueeze using the wavelet transform
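
These blocks compose freely and expose a common forward/inverse interface. A minimal sketch chaining a Householder 1x1 convolution with a Glow-style coupling layer, assuming the Conv1x1 and CouplingLayerGlow constructors from the layer examples above (sizes are illustrative):

using InvertibleNetworks

n_channels = 4    # coupling layers split channels in two, so use an even count
X = randn(Float32, 32, 32, n_channels, 2)

# Householder-based 1x1 convolution (orthogonal, so its log-determinant is zero)
C = Conv1x1(n_channels)

# Glow-style affine coupling layer with a learned residual block inside
CL = CouplingLayerGlow(n_channels, 16; logdet=true)

Y1 = C.forward(X)
Y2, logdet = CL.forward(Y1)

# Invert the composition in reverse order; X_ matches X up to round-off
X_ = C.inverse(CL.inverse(Y2))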

Examples

  • Invertible recurrent inference machines (Putzky and Welling, 2019) (generic example)

  • Generative models with maximum likelihood via the change of variable formula (example; a training sketch follows this list)

  • Glow: Generative flow with invertible 1x1 convolutions (Kingma and Dhariwal, 2018) (generic example, source)
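
In these examples, the training objective from the change-of-variable formula is the negative log-likelihood under a standard normal latent, 0.5‖ZX‖²/batchsize − logdet (up to constants). A minimal training-step sketch, assuming the NetworkGlow constructor and the forward/backward/get_params interface used throughout the package (sizes and step size are illustrative):

using InvertibleNetworks, LinearAlgebra

# Small Glow network: (channels, hidden width, multiscale levels L, steps per level K)
G = NetworkGlow(2, 32, 2, 4)

X = randn(Float32, 16, 16, 2, 4)    # toy training batch
batchsize = size(X)[end]

# Change-of-variables objective
ZX, logdet = G.forward(X)
nll = 0.5f0 * norm(ZX)^2 / batchsize - logdet

# Backpropagate: the first argument is the gradient of the Gaussian
# likelihood term with respect to ZX; parameter gradients accumulate in G
G.backward(ZX / batchsize, ZX)

# Plain gradient-descent update on each network parameter
for p in get_params(G)
    p.data .-= 1f-3 .* p.grad
end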

GPU support

GPU support is provided via Flux/CuArray. To run on the GPU, move the input and the network layer to the GPU via |> gpu:

using InvertibleNetworks, Flux

# Input
nx = 64
ny = 64
k = 10
batchsize = 4

# Input image: nx x ny x k x batchsize
X = randn(Float32, nx, ny, k, batchsize) |> gpu

# Activation normalization
AN = ActNorm(k; logdet=true) |> gpu

# Test invertibility
Y_, logdet = AN.forward(X)
X_ = AN.inverse(Y_)

# X_ matches X up to floating-point round-off
isapprox(X, X_; rtol=1f-3)
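
The same script runs on the CPU by omitting |> gpu; outputs can be moved back to host memory with Flux's |> cpu.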

Reference

If you use InvertibleNetworks.jl in your research, we would be grateful if you cited us using the following BibTeX entry:

@article{Orozco2024,
  doi       = {10.21105/joss.06554},
  url       = {https://doi.org/10.21105/joss.06554},
  year      = {2024},
  publisher = {The Open Journal},
  volume    = {9},
  number    = {99},
  pages     = {6554},
  author    = {Rafael Orozco and Philipp Witte and Mathias Louboutin and Ali Siahkoohi and Gabrio Rizzuti and Bas Peters and Felix J. Herrmann},
  title     = {InvertibleNetworks.jl: A Julia package for scalable normalizing flows},
  journal   = {Journal of Open Source Software}
}

Papers

The following publications use InvertibleNetworks.jl:

Contributing

We welcome contributions and bug reports! Please see CONTRIBUTING.md for guidance.

InvertibleNetworks.jl development subscribes to the Julia Community Standards.

Authors

  • Rafael Orozco, Georgia Institute of Technology [rorozco@gatech.edu]

  • Philipp Witte, Georgia Institute of Technology (now Microsoft)

  • Gabrio Rizzuti, Utrecht University

  • Mathias Louboutin, Georgia Institute of Technology

  • Ali Siahkoohi, Georgia Institute of Technology

Acknowledgments

This package uses functions from NNlib.jl, Flux.jl, and Wavelets.jl.