Model checks against simulation #7
Comments
For a paper one idea could be to simulate from models with the same parameters as obtained in the real-data study (assuming we cover different countries). Not entirely sure how to do that, though (I assume fixing parameters at their posterior means?).
Yes, good idea. I am looking at some simplistic simulation-based calibration as a first step.
In my understanding SBC is the more general approach - if doing that I don't think you need to bother with fixed parameters. If you're interested in particular parts of the parameter space you could subset to that.
Not entirely sure what you mean here - what's SBC? (I found at least "Simulation-Based Calibration" and "Schwarz Bayesian Criterion" as vaguely plausible options ;) ). |
Yes, I agree it is, but there is an argument for testing performance explicitly under certain scenarios. That being said, not planning on doing this here I think.
@jbracher: Yes, that is the one (Simulation-Based Calibration). Quite a few resources out there, but a few here: https://mc-stan.org/docs/2_23/stan-users-guide/simulation-based-calibration.html
Going to close this in favour of #3
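To make the SBC idea concrete, here is a minimal sketch of the rank-statistic check described in the linked Stan guide. This is an illustrative toy example, not the project's model: it uses a conjugate normal-normal model (known observation variance) so the posterior is available in closed form, and all names and parameter values are assumptions for demonstration. (The issue proposes implementing in R; Python is used here only for the sketch.)

```python
# Simulation-based calibration (SBC) sketch for a toy normal-normal model.
# If posterior computation is correct, the rank of the prior draw among
# posterior draws is uniform on {0, ..., n_draws}.
import numpy as np

rng = np.random.default_rng(1)

def sbc_ranks(n_sims=1000, n_obs=20, n_draws=99,
              prior_mu=0.0, prior_sd=1.0, obs_sd=1.0):
    ranks = []
    for _ in range(n_sims):
        theta = rng.normal(prior_mu, prior_sd)        # draw parameter from prior
        y = rng.normal(theta, obs_sd, size=n_obs)     # simulate data given theta
        # Closed-form conjugate posterior for theta (known obs_sd)
        post_prec = 1 / prior_sd**2 + n_obs / obs_sd**2
        post_mu = (prior_mu / prior_sd**2 + y.sum() / obs_sd**2) / post_prec
        draws = rng.normal(post_mu, np.sqrt(1 / post_prec), size=n_draws)
        # Rank of the true theta among the posterior draws
        ranks.append(int((draws < theta).sum()))
    return np.array(ranks)

ranks = sbc_ranks()
```

Checking that the resulting ranks look uniform (e.g. via a histogram or a chi-squared test) is the calibration diagnostic; systematic deviations indicate a miscalibrated posterior.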
Check the model works against some basic simulations generated in R.
Implement by:
generative_model()
function or similar for generating data
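A possible shape for such a helper, sketched below. The issue proposes implementing this in R; this Python version is only illustrative, and the Poisson observation model, parameter names, and return structure are all placeholder assumptions rather than the project's actual model.

```python
# Hypothetical generative_model() sketch: simulate a dataset with known
# parameters so model fits can be checked against the truth.
# The Poisson likelihood here is a stand-in, not the project's real model.
import numpy as np

def generative_model(n_obs=50, rate=10.0, seed=0):
    """Return simulated observations plus the true parameters used."""
    rng = np.random.default_rng(seed)
    y = rng.poisson(rate, size=n_obs)
    return {"y": y, "true_rate": rate}

sim = generative_model()
```

Returning the true parameters alongside the simulated data makes it straightforward to compare posterior estimates against the values that generated the data.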