Preregistering Systematic Reviews

October 6, 2023

This is a Mastodon thread. The original thread is available here:


The article about the generalized preregistration form for systematic reviews is finally out (see doi.org/kt87)! 🥳

It was published on the 22nd of September 2023 - and the process that culminated in that preregistration form and article started on the 9th of February 2018, so over five years ago 😶


It started when I supervised the Bachelor’s thesis of Tim Vincken. He was doing a systematic review about the home advantage: the idea that soccer teams have an advantage when they play at home rather than in their opponent’s city.

We wanted to preregister this systematic review, but we found out that no preregistration form or service existed that we could use 😱

A GIF of a penguin balancing on a soccer ball, but 'tackled' when another penguin comes sliding along.


That was of course not acceptable, so I reached out on Twitter.*

A screenshot from Twitter showing a post asking about registration forms.


That yielded some responses, but not many - until it was retweeted by James Coyne, a much bigger account.

This prompted more discussion, and at some point Nieky van Veggel chimed in. He is a senior lecturer in Animal Health; quite a different field, but he had run into the same problem. He got the attention of Brian Nosek, who put us in touch with Matt Spitzer.

That initiated the development of the form that just got published.

A screenshot from Twitter showing a retweet and an ensuing dialogue as described in the toot.


We then gathered a large group of people with expertise in doing systematic reviews, and set out to develop a (pre)registration form that was as inclusive as possible (both across disciplines and across systematic review types).

In this process, Olmo van den Akker (https://www.ovdakker.com/) became the de facto leader, and he expertly guided the project to the finish line.

A GIF of a person saying 'we love every single one of them'.


The result is a form that is relatively elaborate, but with items formulated so that they should fit reviews in fields such as chemistry, law, and psychology, and review types such as scoping reviews, qualitative systematic reviews, and meta-analyses.

The form has six sections:

1️⃣ Metadata; 2️⃣ Methods; 3️⃣ Search Strategy; 4️⃣ Screening; 5️⃣ Extraction; and 6️⃣ Synthesis and Quality Assessment.


The Metadata section just contains, well, metadata.

The Methods section describes the general methodological constraints of the review: the review type, stages, start and end date, background, research questions, hypotheses and/or expectations, software, funding, conflicts of interest, and how to deal with overlapping authorships.


The form only uses open fields, i.e. no ‘categorical choices’. We figured that if we want to be inclusive, we shouldn’t impose whatever we happen to think is an exhaustive list of options.

In the Search Strategy section, you list the databases and interfaces, how you plan to deal with grey literature, your query, in- and exclusion criteria, search strategy validation, how you’ll deal with contacting authors (to get missing info), and search expiration and search repetition planning.

A GIF of a person searching for something.


The Screening section has 10 items, covering things like screening stages, screened fields and masking, exclusion criteria, instructions, reliability and reconciliation, and sampling.

The idea is that, as much as possible, completing these items comprehensively should give screeners everything they need to start screening.

A GIF of a person typing on a keyboard with their head seemingly in a computer screen.


After screening, it’s time to extract stuff! In the Extraction section, you specify which entities you extract, in which stages, with which instructions, as well as whether extractors will be masked and how you deal with reliability and reconciliation.

In my experience, developing good definitions of the entities to be extracted is one of the hardest and most time-consuming parts of doing a systematic review - so this section may be one of the most demanding to complete.


And then, finally, you specify your plans for Synthesis and Quality Assessment.

Here, you describe your planned data transformations, your missing data plans, your data validation plans, your quality assessment plans, and of course your synthesis plans, as well as your inference criteria (if you have any), synthesist masking, reliability and reconciliation, and finally your planned publication bias analyses and sensitivity analyses.


And that’s it. The whole form has 65 items. Yes, 65. That’s a feature, not a bug. The form was designed to enable pretty comprehensive documentation of your plans.

And as a pleasant side-effect, that means the form lends itself well to supporting the planning of your systematic review.

For example, it prompts you to think about whether you want to mask screeners from some fields (common), and/or extractors (rare), and/or synthesists (almost never done).


The form is, however, designed to be applicable to all disciplines and review types. More specialized forms will generally contain items that are more pertinent to your discipline or review type - but for many combinations, those may not exist yet.

One of the nice things about this form is that it’s available in the {preregr} R package (see https://preregr.opens.science).
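
For example, you can pull up the form directly from R. A minimal sketch - I’m assuming here that the form identifier is ‘genSysRev_v1’ (the same one used in the command further down) and that form_show() behaves as I remember from the package documentation, so double-check at https://preregr.opens.science:

# Install {preregr} if you don't have it yet, then print the form's
# sections and items to the console.
install.packages("preregr");
preregr::form_show("genSysRev_v1");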

That means you can easily adjust it - or create an R Markdown template to fill out.


You can do that with:

preregr::form_to_rmd_template("genSysRev_v1", file = "C:/path/to/file.Rmd");

That loads the form and writes it to the file you specified. You can put that in a Git repository (for example) and then complete it. Once you and all co-authors agree on the contents, you can move the contents to the form on the OSF, if you want.


Alternatively, you can render the {preregr} form to an HTML file. It will then also include both the form specification and the form’s contents as machine-readable embedded JSON.

Or both, of course.
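
To illustrate that second route, here’s a minimal sketch. The function names (prereg_initialize(), prereg_specify(), prereg_spec_to_html()) and their arguments are written from memory of the {preregr} documentation, and the item I fill in is purely illustrative - so verify everything against https://preregr.opens.science before relying on this:

# Initialize an empty (pre)registration from the form.
prereg <- preregr::prereg_initialize("genSysRev_v1");

# Specify content for one or more items ('title' is an illustrative
# item name; the real item identifiers are listed in the form itself).
prereg <- preregr::prereg_specify(
  prereg,
  title = "The home advantage in soccer: a systematic review"
);

# Render the specification to an HTML file that embeds both the form
# specification and your answers as machine-readable JSON
# (the 'file' argument name is my assumption; check the package reference).
preregr::prereg_spec_to_html(prereg, file = "sysrev_prereg.html");

That embedded JSON is what makes the registration machine-readable, which should make it possible to read the specification back into R or other tools later on.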

The Center for Open Science posted a blog post about the form at https://www.cos.io/blog/generalized-systematic-review-template-joins-osf-registries, and I recorded a talk about the form which is available at https://youtu.be/qB3n9u4VnY0?feature=shared.


And of course, the form itself is also always available at doi.org/kt87.

I hope this thread was useful!

If you plan to do a systematic review, feel free to get in touch - I’m still working on {metabefor}, an R package to help with Modular, Extensible, Transparent, Accessible, Bootstrapped Extraction For Systematic Reviews (i.e. METABEFor), and recently created the Extraction Validation App, EVA: https://opens.science/apps/eva. Not documented well yet, though - so as I said, get in touch 🙂