[4][5] While a systematic review may be applied in the biomedical or health care context, it may also be used where an assessment of a precisely defined subject can advance understanding in a field of research.
The distinction between the two is that a meta-analysis uses statistical methods to derive a single number (such as an effect size) from the pooled data, whereas the strict definition of a systematic review excludes that step.
A systematic review can be designed to provide a thorough summary of current literature relevant to a research question.
[1] A systematic review uses a rigorous and transparent approach for research synthesis, with the aim of assessing and, where possible, minimizing bias in the findings.
[12][13][14] The EPPI-Centre, Cochrane, and the Joanna Briggs Institute have been influential in developing methods for combining both qualitative and quantitative research in systematic reviews.
[19] A list of PRISMA guideline extensions is hosted by the EQUATOR (Enhancing the QUAlity and Transparency Of health Research) Network.
[25][26] This can mean that the search concepts and methods (including data extraction, organisation, and analysis) are refined throughout the process, sometimes requiring deviations from any protocol or original research plan.
Scoping reviews are helpful when it is not possible to carry out a systematic synthesis of research findings, for example, when there are no published clinical trials in the area of inquiry.
[38][39] Clinical reviews of quantitative data are often structured using the mnemonic PICO, which stands for 'Population or Problem', 'Intervention or Exposure', 'Comparison', and 'Outcome', with other variations existing for other kinds of research.
Inviting and involving an experienced information professional or librarian can improve the quality of systematic review search strategies and reporting.
[38] This can include assessing whether a data source meets the eligibility criteria and recording the reasons for its inclusion in or exclusion from the review.
When appropriate, some systematic reviews include a meta-analysis, which uses statistical methods to combine data from multiple sources.
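To make that statistical step concrete, the following is a minimal sketch of fixed-effect inverse-variance pooling, one common way a meta-analysis combines study-level results into a single estimate; the effect sizes and standard errors shown are hypothetical values invented for illustration, not data from any particular review.

```python
# Minimal sketch of the statistical step a meta-analysis adds to a
# systematic review: pooling study-level effect sizes into one estimate.
# Fixed-effect inverse-variance weighting only; the numbers below are
# made-up illustrative values, not data from any real review.
import math

def fixed_effect_pool(effects, std_errors):
    """Return the pooled effect and its standard error.

    Each study is weighted by the inverse of its variance, so more
    precise studies contribute more to the combined estimate.
    """
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Hypothetical effect sizes (e.g. log odds ratios) from three studies.
effects = [-0.35, -0.10, -0.22]
std_errors = [0.12, 0.20, 0.15]

pooled, se = fixed_effect_pool(effects, std_errors)
ci_low, ci_high = pooled - 1.96 * se, pooled + 1.96 * se
print(f"pooled effect = {pooled:.3f}, 95% CI ({ci_low:.3f}, {ci_high:.3f})")
```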
[38] The Cochrane logo depicts a forest plot from one of the first reviews, which showed that corticosteroids given to women about to give birth prematurely can save the life of the newborn child.
[54] Some users do not have time to invest in reading large and complex documents, may lack awareness of newly published research, or may be unable to access it.
Researchers are, therefore, developing skills to use creative communication methods such as illustrations, blogs, infographics, and board games to share the findings of systematic reviews.
Living systematic reviews are "dynamic, persistent, online-only evidence summaries, which are updated rapidly and frequently".
Most notable among international organisations is Cochrane, a group of over 37,000 specialists in healthcare who systematically review randomised trials of the effects of prevention, treatments, and rehabilitation as well as health systems interventions.
The 2015 impact factor for The Cochrane Database of Systematic Reviews was 6.103, and it was ranked 12th in the Medicine, General & Internal category.
[71] Standardised Data on Initiatives (STARDIT) is another proposed way of reporting who has been involved in which tasks during research, including systematic reviews.
[79] Uptake has since been rapid, with the estimated number of systematic reviews in the field doubling since 2016 and the first consensus recommendations on best practice, as a precursor to a more general standard, being published in 2020.
[81][82] Several organisations use systematic reviews in social, behavioural, and educational areas of evidence-based policy, including the National Institute for Health and Care Excellence (NICE, UK), Social Care Institute for Excellence (SCIE, UK), the Agency for Healthcare Research and Quality (AHRQ, US), the World Health Organization, the International Initiative for Impact Evaluation (3ie), the Joanna Briggs Institute, and the Campbell Collaboration.
The top six software tools (those with at least 21 of 30 key features) are all proprietary paid platforms, typically web-based.[89] The Cochrane Collaboration provides a handbook for systematic reviewers of interventions, which "provides guidance to authors for the preparation of Cochrane Intervention reviews".
[92] A 2003 study suggested that extending searches beyond major databases, perhaps into grey literature, would increase the effectiveness of reviews.
They proposed several solutions, including limiting studies in meta-analyses and reviews to registered clinical trials, requiring that original data be made available for statistical checking, paying greater attention to sample size estimates, and eliminating dependence on only published data.
[106] The rapid growth of systematic reviews in recent years has been accompanied by poor compliance with guidelines, particularly in areas such as declaration of registered study protocols, declaration of funding sources, risk-of-bias data, issues arising from data abstraction, and description of clear study objectives.
[107][108][109][110][111] A host of studies have identified weaknesses in the rigour and reproducibility of search strategies in systematic reviews.
[126] A 1904 British Medical Journal paper by Karl Pearson collated data on typhoid inoculation from several studies in the UK, India, and South Africa.
[128] The critical appraisal and synthesis of research findings in a systematic way emerged in 1975 under the term 'meta-analysis'.
[129][130] Early syntheses were conducted in broad areas of public policy and social interventions, with systematic research synthesis applied to medicine and health.
[132] Archie Cochrane's call for the increased use of randomised controlled trials and systematic reviews led to the creation of The Cochrane Collaboration,[133] which was founded in 1993 and named after him, building on the work of Iain Chalmers and colleagues in the area of pregnancy and childbirth.