Question/feature request about amortized inference #57
Comments
Amortized inference would be something that is supported by AdvancedVI.jl, but unfortunately, it is not yet possible to do this without pirating the internals. Supporting amortization is on the agenda of AdvancedVI.jl though.
Thank you for your reply. Please feel free to close or transfer this issue.
With the current state of things, I think this would be a simple example in the documentation once #38 is worked out.
I am not an expert in NNs, but my understanding is that part of the "magic" of BayesFlow is that it uses either a DeepSet or SetTransformer for its summary network. I couldn't find any record of that in AdvancedVI. Does this exist in Julia or is it something you plan to add?
The magic of Julia is that we don't need everything to be implemented in-house in order for things to work. If DeepSets is implemented elsewhere, it could definitely be used here. Though using transformers for general Bayes looks like quite a bit of overkill to me.
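For concreteness, here is a minimal sketch of what a DeepSet-style summary network could look like in Julia with Flux.jl. The `DeepSet` struct, layer sizes, and dimensions are illustrative assumptions, not an existing AdvancedVI.jl or BayesFlow API.

```julia
# Illustrative sketch (not an AdvancedVI.jl API): a DeepSet encodes each
# observation with a shared network ϕ, pools the encodings with a
# permutation-invariant operation (here the mean), and maps the pooled
# vector to summary statistics with a second network ρ.
using Flux
using Statistics: mean

struct DeepSet{E,D}
    ϕ::E   # per-observation encoder
    ρ::D   # post-pooling network
end

Flux.@functor DeepSet   # register both fields as trainable

function (ds::DeepSet)(x::AbstractMatrix)
    h = ds.ϕ(x)                      # x is d × n; encode each column (observation)
    pooled = vec(mean(h; dims = 2))  # order-invariant pooling over observations
    return ds.ρ(pooled)              # map pooled embedding to summary statistics
end

# Example: compress data sets of 2-dimensional observations into 8 learned summaries
summary_net = DeepSet(
    Chain(Dense(2 => 64, relu), Dense(64 => 64, relu)),
    Chain(Dense(64 => 32, relu), Dense(32 => 8)),
)

x = randn(Float32, 2, 100)  # one simulated data set with 100 observations
s = summary_net(x)          # 8-element summary, invariant to the order of the columns
```

Because the pooling step is a plain mean over observations, the output does not change if the columns of `x` are permuted, which is the property that makes this architecture suitable for summarizing i.i.d. data sets.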
@itsdfish Hi, I am quite interested in this. If you are still interested too, please reach out directly so that we may discuss things together. See my contact details here: https://michaelgutmann.github.io/contact/ Thanks!
Hello,
I have been looking for a Julia package to perform amortized Bayesian inference on simulation-based models. In many areas of science, there are models with unknown or intractable likelihood functions. Unfortunately, Bayesian inference with these models has historically been difficult. Somewhat recently, there has been progress in this area using a special type of normalizing flow. The method uses a neural network to learn optimized summary statistics for approximate Bayesian computation. The end result is very accurate amortized inference that can be applied to any model, including those without a known likelihood function. Currently, this method is only implemented in a Python package called BayesFlow.
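As a rough illustration of the training loop behind this idea (not BayesFlow itself, and not an existing Turing/AdvancedVI API): simulate (θ, x) pairs from the prior and simulator, summarize each simulated data set, and train a network to recover θ from the summaries. In the sketch below a diagonal Gaussian stands in for the conditional normalizing flow purely to keep the example short, and `sample_prior` and `simulator` are placeholders for a user-supplied model.

```julia
# Hypothetical sketch of amortized simulation-based training with Flux.jl.
# BayesFlow uses a conditional normalizing flow as the inference network;
# a diagonal Gaussian is used here only to keep the example compact.
using Flux
using Statistics: mean

sample_prior() = randn(Float32, 2)                    # θ ~ p(θ)      (placeholder)
simulator(θ) = θ .+ 0.5f0 .* randn(Float32, 2, 50)    # x ~ p(x | θ)  (placeholder)

pool(h) = vec(mean(h; dims = 2))   # permutation-invariant pooling over observations

amortizer = Chain(
    Dense(2 => 32, relu), Dense(32 => 16),  # per-observation summary encoder
    pool,
    Dense(16 => 32, relu), Dense(32 => 4),  # predicts (μ, log σ) for the 2 parameters
)

opt_state = Flux.setup(Adam(1f-3), amortizer)

for step in 1:1_000
    θ = sample_prior()
    x = simulator(θ)
    grads = Flux.gradient(amortizer) do m
        out = m(x)
        μ, logσ = out[1:2], out[3:4]
        # negative log q(θ | x) for a diagonal Gaussian (constants dropped)
        sum(@. logσ + 0.5f0 * ((θ - μ) / exp(logσ))^2)
    end
    Flux.update!(opt_state, amortizer, grads[1])
end

# After training, approximate posterior inference for a new data set
# is a single forward pass through the amortizer:
x_obs = simulator(sample_prior())
out = amortizer(x_obs)
μ, logσ = out[1:2], out[3:4]
```

The amortization comes from the fact that all the cost is paid during training over simulated data sets; inference for any new observed data set afterwards only requires a forward pass.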
Given Turing's interest in machine learning and Bayesian inference, I was wondering whether there is interest in adding this method to the Turing ecosystem. I think it would add a lot of value to the community.