
Add a review step before publication of routine reports [wip] #286

Open
brianmackenna opened this issue Apr 6, 2022 · 7 comments

@brianmackenna

Best practice mandating review is in place across many Bennett Institute websites. I propose that we introduce a mandatory review stage before publishing new/updated reports on reports.opensafely.org

This may include code review, but at this stage I'm most keen to get a sense check on charts, figures, etc. before they are made publicly available front and centre.

@brianmackenna brianmackenna changed the title Add a review step before publication of routine reports Add a review step before publication of routine reports [wip] Apr 6, 2022
@HelenCEBM

For reference, here's our existing guidance on publishing reports: https://bennettinstitute-team-manual.pages.dev/products/opensafely-reports/#publishing

@HelenCEBM

Should the review be enforced through a technical solution, e.g. a second person needs to sign in and approve before (a) first publication and maybe (b) every update? Or, e.g., making an issue for review, as per output checking? Just a Slack discussion, or something else?

@LisaHopcroft

LisaHopcroft commented May 3, 2022

The process of review for the vaccine reports could look like:

  1. review of format and static text using dummy data, including checking for:
    • adequate context and introduction (consider whether interpretable for journalists)
    • links to external documentation (e.g., The Green Book for vaccines)
    • any necessary caveats
    • links to other relevant Bennett repos/papers/reports
    • explanatory text for figures/tables (e.g., table footnotes)
    • all necessary breakdowns included (e.g., using ethnicity 16 where possible/appropriate rather than ethnicity 6)
    • statement of parameter values? (e.g., time between doses, though this is in the code)
  2. review of content (data, tables, figures etc)
    • sense checking: e.g., percentages/ratios not greater than 100 or 1
    • cross-check with national data (check coverage rates and that ours are ~40% of national numbers)
    • output checking: including low numbers, unapproved outputs
    • code review
    • readability of figures (e.g., accurate axes, readable labels)
    • some elements may need to be revisited once real data are used, e.g.:
      • all necessary breakdowns included (e.g., using ethnicity 16 where possible/appropriate rather than ethnicity 6)
      • explanatory text for figures/tables (e.g., table footnotes)

Step (1) certainly needs manual review - clinical informatician + data scientist?
Some of Step (2) could be handled automatically (e.g., by tests and automated output checking).
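The automatable checks in step (2) could be sketched roughly as follows. This is a minimal illustration only, assuming the report outputs are available as plain lists of numbers; the function names and the low-number threshold of 6 are assumptions for the sake of the example, not anything from the OpenSAFELY codebase.

```python
# Hypothetical automated sense checks for report outputs (step 2 above).
# All names and thresholds are illustrative.

def check_percentages(values):
    """Return any percentages outside the valid 0-100 range."""
    return [v for v in values if not 0 <= v <= 100]

def check_ratios(values):
    """Return any ratios outside the valid 0-1 range."""
    return [v for v in values if not 0 <= v <= 1]

def check_low_numbers(counts, threshold=6):
    """Flag non-zero counts below a disclosure threshold
    (small-number suppression, as in output checking)."""
    return [c for c in counts if 0 < c < threshold]
```

Checks like these could run as ordinary tests in CI, leaving the manual review in step (1) to focus on interpretation and presentation.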

@LisaHopcroft

A technical solution sounds good to me.

If we want an explicit record of checking each pre-defined element (like those provided above), could we use pull request templates for the first release?

@brianmackenna

@LisaHopcroft

The document has been fleshed out with further description of checks and processes.

What is missing so far is how (if?) we're going to record this process. I know that we want to make this light touch, but perhaps we could have an issue for each publication with a list of things to tick off? The issue could be generated using a template so that we don't forget anything.
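A generated issue along those lines might look something like the sketch below. This is an illustrative template only, not one from the repo; the path and the item wording are assumptions drawn from the checks listed earlier in this thread.

```markdown
<!-- .github/ISSUE_TEMPLATE/report-review.md (hypothetical path) -->
---
name: Report publication review
about: Checklist to complete before a report is published
---

## Format and static text (dummy data)
- [ ] Adequate context and introduction
- [ ] Links to external documentation and relevant Bennett repos/papers/reports
- [ ] Necessary caveats stated
- [ ] Explanatory text for figures/tables

## Content (real data)
- [ ] Sense checks passed (percentages ≤ 100, ratios ≤ 1)
- [ ] Cross-checked against national data
- [ ] Output checking complete (low numbers, unapproved outputs)
- [ ] Code reviewed
- [ ] Figures readable (accurate axes, readable labels)
```

Opening such an issue per publication would keep the process light touch while leaving a tick-off record.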

@LisaHopcroft

To keep this up to date, we now have three documents for:

And four issue templates that capture a checklist for:

Users are informed in the documentation and in the checklist preamble that they must provide a completed checklist before final sign-off is granted.
