Evidence synthesis is a research method that brings together all relevant information on a research question. It can be used to identify gaps in knowledge, establish an evidence base for best-practice guidance, or inform policymakers and practitioners. Many types of output draw on evidence synthesis, such as policy briefs, systematic reviews and clinical practice guidelines.
There are several organisations dedicated to supporting and collating systematic reviews with relevance to public health policy and practice. These include:
- The Cochrane Collaboration. This global organisation consists of around 40 review groups, each with a focus on a different health topic. Cochrane reviews are usually focused on the effectiveness of clinical interventions, and take a somewhat narrow approach to evidence synthesis. The Public Health group supports people from all over the world to conduct systematic reviews to the Cochrane model and standard. You can read more about the Cochrane Library, review methods and software here.
- The Campbell Collaboration has a similar approach to the Cochrane Library, but focuses more on education, social welfare and development.
- The EPPI-Centre is part of the Institute of Education, UCL. This unit conducts systematic reviews for different government departments, covering a wide variety of topics and methods. The Centre has also developed novel synthesis techniques, software and review methods, including participatory methods. They run training and seminars; see below for details.
- 3ie funds, produces, quality assures and synthesises rigorous evidence on development effectiveness. They support evaluations and reviews that examine what works, for whom, why, and at what cost in low- and middle-income countries.
- Centre for Reviews and Dissemination (CRD), York. CRD conducts health-relevant systematic reviews and has developed particular expertise in high-quality systematic reviews and associated economic evaluations.
- The Health Evidence Network, WHO. HEN produces a variety of publications to meet policy-makers’ needs for evidence, synthesizing the best available evidence in response to policy-makers’ questions. These include joint policy briefs and policy summaries, produced with the European Observatory on Health Systems and Policies, which synthesize the evidence around specific policy options for tackling key health system issues; and HEN summaries of reports, including synopses of the main findings and policy options.
There are additionally many local and global networks for those interested in doing and using systematic review evidence in public health policy and practice; e.g. the Africa Evidence Network.
Synthesis methods
Synthesis methods can include bringing together published and grey literature and stakeholder input. There is a proliferation of terms used to describe different kinds of reviews; the most important thing is to be transparent and clear about your aims, methods, and reporting.
In the same way that methods used in primary research will depend upon the research question, the type of evidence review and synthesis method will also be determined by the question or topic it seeks to address. Thomas et al. (2012) argue that there is a spectrum of synthesis methods, from more aggregative (trying to bring together a set of experimental findings on one or more outcomes) to configurative (trying to bring together diverse evidence to provide an overall picture of what the literature looks like in a particular area; more akin to assembling a jigsaw puzzle):
Figure 1. Methodological continuum of synthesis approaches and methods. Source: Adapted from Thomas et al. (2012).
Choosing a synthesis method will depend on the resources available to you, the question you’re asking, and the needs of your stakeholders. Some examples include:
| Synthesis method | Description | Examples / resources |
| --- | --- | --- |
| Content analysis | Coding and categorisation of qualitative / quantitative data | Use of content analysis to conduct knowledge-building and theory-generating qualitative systematic reviews; A systematic review and content analysis of bullying and cyber-bullying measurement strategies |
| Framework | Highly structured data extraction and organisation, using an a priori framework | Attitudes to walking and cycling among children, young people and parents: a systematic review; Using framework-based synthesis for conducting reviews of qualitative studies |
| Narrative synthesis / review | Discusses diverse forms of evidence side by side, to generate new insights | Public acceptability of government intervention to change health-related behaviours: a systematic review and narrative synthesis |
| Mapping / scoping review | Structured survey (open/closed-ended questions) of a systematically identified study cohort | Systematic review or scoping review? Guidance for authors when choosing between a systematic or scoping review approach |
| Meta-analysis | Quantitative pooling of effect sizes across studies; involves identifying findings, assessing heterogeneity, categorisation and grouping of categories. Can include diverse data types (Bayesian) | Mass deworming for soil-transmitted helminths and schistosomiasis among pregnant women: a systematic review and individual participant data meta-analysis |
| Meta-synthesis | Aggregating and/or configuring results from inter-related qualitative studies, aiming to develop theory and/or generalisable results. Can include thematic synthesis, meta-ethnography and meta-narrative approaches | Methods for the thematic synthesis of qualitative research in systematic reviews; Qualitative synthesis: A guide to conducting a meta-ethnography; Storylines of research in diffusion of innovation: a meta-narrative approach to systematic review |
| Realist | Identification of causal mechanisms that underlie interventions or programmes; builds explanations of why they work (or not) for particular groups in particular contexts | Implementing health promotion programmes in schools: a realist systematic review of research and experience in the United Kingdom |
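The core calculation behind the meta-analysis row above is inverse-variance pooling: each study's effect estimate is weighted by the reciprocal of its variance, so more precise studies count for more. A minimal sketch of the fixed-effect version in plain Python (the function name and the three studies' effects and variances are hypothetical, not drawn from any review cited here):

```python
import math

def fixed_effect_meta(effects, variances):
    """Inverse-variance fixed-effect pooling of study effect sizes.

    effects: per-study effect estimates (e.g. log odds ratios)
    variances: corresponding within-study variances
    Returns (pooled effect, standard error, Cochran's Q).
    """
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    # Cochran's Q: weighted squared deviations from the pooled effect,
    # a standard statistic for assessing between-study heterogeneity
    q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
    return pooled, se, q

# Hypothetical example: three studies with illustrative effects/variances
pooled, se, q = fixed_effect_meta([0.20, 0.35, 0.10], [0.04, 0.09, 0.02])
```

A random-effects model extends this by adding a between-study variance component to each weight; when heterogeneity (large Q) is present, that is usually the more defensible choice.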
There have been recent calls to ensure that systematic reviews become more responsive to policy and public needs through greater inclusion and transparency. Some recent interesting papers on this topic include Open Synthesis: on the need for evidence synthesis to embrace Open Science by Neal Haddaway and Stakeholder involvement in systematic reviews: a scoping review by Alex Pollock and team.
Tools and Resources
The first step in a systematic review is usually a research protocol, which is often required by funders. Consider publishing your protocol in a journal such as Systematic Reviews or registering it with the PROSPERO database.
The EPPI-Centre has an online database of systematic reviews which can be searched here. They also have many methodological publications covering topics such as user involvement, text mining, information management, searching and synthesising studies, quality and critical appraisal, and using evidence.
Moving from systematic review evidence to making recommendations is not a straightforward process. The Grading of Recommendations Assessment, Development and Evaluation (GRADE) working group began in 2000 as an informal collaboration of people with an interest in addressing the shortcomings of grading systems in health care. The working group has developed a common, sensible and transparent approach to grading the quality (or certainty) of evidence and the strength of recommendations. Many international organizations have provided input into the development of the GRADE approach, which is now considered the standard in guideline development. GRADE now has dedicated software to enable the more transparent use of evidence in producing recommendations for public health policy and practice.
The EQUATOR (Enhancing the QUAlity and Transparency Of health Research) Network is an international initiative that seeks to improve the reliability and value of published health research by promoting transparent and accurate reporting and wider use of robust reporting guidelines. They publish reporting standards for all kinds of study designs and are a useful resource for critical appraisal techniques.
There are a number of resources to help support the translation of systematic reviews into policy decisions, but a useful guide is this report by the Alliance for Useful Evidence. It is a short introduction for decision-makers and researchers or anyone else considering whether a systematic review may be appropriate to fill a gap in knowledge or to use as a resource. It will help anybody planning on commissioning a review of what research is already out there.
For the coders, the SR Toolbox is a hub that brings together code and tools used by software developers with a specific interest in systematic reviews. There’s more on text mining and machine learning on the EPPI-Centre website too.
Events
The London Systematic Reviews and Research Use Seminars run 12.30-13.45 on the 3rd Tuesday of the month. These seminars aim to encourage discussion and information-sharing on methods issues in systematic reviews and the study of evidence use for those in the London area (though visitors are welcome too). Presentations last for about 25-30 minutes to allow time for discussion.
The table highlights past events hosted by the Centre. Details on upcoming events can be found on our website. If you are interested in giving a seminar to our members, or would like support with hosting an event, please contact evaluation@lshtm.ac.uk.
| Title | Overview | Speaker | Link to recording |
| --- | --- | --- | --- |
| Evidence Standards and justifiable evidence claims | In developing findings and conclusions from their studies, researchers are making ‘evidence claims’. We therefore need to consider what criteria are used to make and justify such claims. This presentation will consider the use of evidence standards to make evidence claims in relation to primary research, reviews of research (making statements about the nature of an evidence base), and guidance and recommendations informed by research. The aim is to go beyond testing the trustworthiness (quality appraisal) of individual studies to discuss the ways in which evidence standards are used to make evidence claims to inform decisions in policy, practice, and personal decision making. | David Gough (UCL) | Watch online |
| Using realist synthesis to understand the effectiveness and implementation of health programmes | Realist synthesis is an approach to systematic review which explicitly aims to identify and refine programme theory – that is, develop explanations of how and why programmes ‘work’. While realist methods have been successfully used to understand the effectiveness of policies and programmes and to inform intervention design and development, we believe the overall logic of realist explanation can also be used to better understand the implementation and effective delivery of complex health interventions and programmes. After introducing the fundamentals of realist inquiry, and the approach of realist synthesis, we will present our realist synthesis of the implementation of health promotion in schools. As well as presenting and discussing the findings of this synthesis, we will share our reflections on the value, limitations and conduct of realist synthesis; the separability of explanations of ‘effectiveness’ and ‘implementation’; and how such syntheses might better inform process evaluations and implementation science. | Mark Pearson (PenCLAHRC), Rob Anderson (Uni Exeter) | Watch online |
| Qualitative Comparative Analysis: A worked example to explore why some weight management interventions worked better than others | Statistical methods for synthesising evidence of effectiveness depend on intervention replication in order to operate effectively. For syntheses in which each intervention may differ from another (in sometimes unknown ways), they are less suitable. By contrast, Qualitative Comparative Analysis (QCA), an approach which has recently been employed in systematic reviews, makes use of the inherent variance in complex interventions. QCA seeks to answer a different question to that asked by previous reviews, i.e. rather than ‘what works, on average’, it aims to understand the mechanisms through which different interventions have the impact that they do. QCA systematically identifies configurations, or combinations, of various intervention and other contextual characteristics that are (or are not) present when an intervention has been successful (or not) in obtaining a desired outcome. QCA allows for multiple overlapping pathways to causality, and it identifies combinations of conditions as opposed to isolating the effects of single characteristics on intervention effectiveness. This may better represent the complex causal pathways that often characterise complex social interventions. This session uses a worked example to outline the process of conducting a QCA and illustrate how the method was able to identify critical components of weight management interventions where other methods had been unsuccessful. | Katy Sutcliffe (EPPI-Centre, UCL), GL Melendez (Uni of Warwick), Helen Burchett | Watch online |
| A meta-ethnographic approach to synthesising theory in systematic literature reviews | | Pandora Pound, Tara Tancred | Watch online |
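The QCA seminar above rests on a simple mechanic: code each study on a set of binary conditions, build a truth table of the configurations that occur, and look for configurations that consistently co-occur with success. A minimal crisp-set sketch in plain Python (the condition names and the four coded studies are entirely hypothetical, and this is not the API of any real QCA package):

```python
# Hypothetical coded studies: each maps intervention conditions (True/False)
# to whether the desired outcome was achieved. Names are illustrative only.
studies = [
    {"group_support": True,  "diet_advice": True,  "exercise": False, "outcome": True},
    {"group_support": True,  "diet_advice": False, "exercise": True,  "outcome": True},
    {"group_support": False, "diet_advice": True,  "exercise": False, "outcome": False},
    {"group_support": False, "diet_advice": False, "exercise": True,  "outcome": False},
]

conditions = ["group_support", "diet_advice", "exercise"]

def truth_table(studies, conditions):
    """Group studies by their configuration of conditions, recording the
    outcomes observed for each configuration."""
    table = {}
    for s in studies:
        config = tuple(s[c] for c in conditions)
        table.setdefault(config, []).append(s["outcome"])
    return table

table = truth_table(studies, conditions)
# "Consistent" configurations: every study with this combination succeeded
successful = [cfg for cfg, outcomes in table.items() if all(outcomes)]
```

In a real analysis the consistent configurations would then be logically minimised (Boolean simplification) to express the fewest conditions that distinguish success from failure; here the toy data would reduce to group_support being the shared ingredient of the successful configurations.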
Training
LSHTM runs a short course: Systematic reviews and meta-analyses of health research.
Cochrane runs a range of free and paid online courses, as does Campbell. The EPPI-Centre runs an MSc in Systematic Reviews for Social Policy and Practice as well as a range of tailored shorter courses.
Short courses
1. Systematic reviews and meta-analyses of health research
This five-day course runs in April each year. It provides participants with a basis in the design, analysis and interpretation of systematic reviews of health research. Participants will be given a grounding in all aspects of conducting a systematic review and meta-analysis, and will have the opportunity to gain practical experience of the tasks involved. By the end of the course participants will be equipped with the necessary skills to conduct their own high-quality systematic reviews.
Read more and register interest for 2020 online.
2. Conducting a Systematic Review: a practical guide run by Specialist Unit for Review Evidence (SURE), Cardiff University
A practical and highly interactive course designed to equip participants with an understanding of the systematic review process and an introduction to the skills necessary to conduct a review. The course is taught over four days with a range of discussion, group and hands-on sessions. Attendees should come to the course with a research topic and leave with a draft protocol for their systematic review.
By the end of the course, attendees should be able to:
- Develop a focused question
- Identify the evidence to answer that question
- Assess the quality/validity of the identified evidence
- Decide what form of evidence synthesis is most appropriate
- Present the results to meet the needs of healthcare professionals and other researchers
- Develop a strategy to publicise the results
Target audience:
- Postgraduate healthcare researchers
- Librarians
- Healthcare professionals
- Policy makers