Background
Public health and health service interventions are typically complex. They are usually multifaceted, with impacts at multiple levels and on multiple stakeholders. Moreover, the systems within which they are implemented may change and adapt to enhance or dampen their impact.1 Quantitative syntheses ('meta-analyses') of studies of complex interventions seek to integrate quantitative findings across multiple studies to achieve a coherent message greater than the sum of their parts. Interest is growing in how the standard systematic review and meta-analysis toolkit can be enhanced to address the complexity of interventions and their impact.2 A recent report from the Agency for Healthcare Research and Quality and a series of papers in the Journal of Clinical Epidemiology provide useful background on some of the challenges.3–6
This paper is part of a series, commissioned by WHO, exploring the implications of complexity for systematic reviews and guideline development.7 As covered by other papers in the series, guideline development encompasses the consideration of many different aspects,8 such as intervention effectiveness, economic considerations, acceptability9 or certainty of evidence,10 and requires the integration of different types of quantitative as well as qualitative evidence.11 12 This paper is specifically concerned with methods available for synthesising quantitative results in the context of a systematic review of the effects of a complex intervention. We aim to point those collating evidence in support of guideline development to methodological approaches that will help them integrate the quantitative evidence they identify. Table 1 summarises how these methods link to many of the types of complexity encountered, drawing on the examples tabulated in an earlier paper in the series.1 Table 2 provides an annotated list of the methods we cover.
We begin by reiterating the importance of starting with meaningful research questions, a clear sense of the purpose of the synthesis and an awareness of any relevant background knowledge. An important issue in systematic reviews of complex interventions is that the data available for synthesis are often extremely limited, owing to small numbers of relevant studies and to limitations in how those studies are conducted and reported. Furthermore, it is uncommon for two studies to evaluate exactly the same intervention, in part because of the interventions' inherent complexity; each study may be designed to provide information on a unique context or a novel intervention approach, and outcomes may be measured in different ways and at different time points. We therefore discuss possible approaches when data are highly limited or highly heterogeneous, including graphical approaches to presenting very basic summary results. We then discuss statistical approaches for combining results and for understanding the implications of various kinds of complexity.
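For readers less familiar with meta-analysis, the sketch below shows the standard random-effects model in its simplest form, to illustrate what is meant by statistically combining results across studies. It is a generic illustration rather than the model used in any specific review discussed in this paper, and the notation (an effect estimate $y_i$ and within-study variance $v_i$ for each study $i$, and a between-study variance $\tau^2$) is introduced here purely for exposition.

$$
y_i = \theta + u_i + \varepsilon_i, \qquad u_i \sim N(0, \tau^2), \qquad \varepsilon_i \sim N(0, v_i),
$$

$$
\hat{\theta} = \frac{\sum_i w_i y_i}{\sum_i w_i}, \qquad w_i = \frac{1}{v_i + \hat{\tau}^2}, \qquad \operatorname{SE}(\hat{\theta}) = \sqrt{\frac{1}{\sum_i w_i}},
$$

where $\theta$ is the average intervention effect and $\tau^2$ captures heterogeneity between studies. In the context of complex interventions, much of the methodological interest lies in whether a single pooled estimate $\hat{\theta}$ is meaningful at all, and in how the heterogeneity represented by $\tau^2$ can be explored and explained.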
In several places we draw on the example of a review undertaken to inform a recent WHO guideline on protecting, promoting and supporting breast feeding.13 The review sought to determine the effects of interventions to promote breast feeding delivered in five types of settings (health services, home, community, workplace and the policy context) or a combination of settings.8 The included interventions were predominantly multicomponent and were implemented in complex systems across multiple contexts. The review included 195 studies, many from low-income and middle-income countries, and concluded that interventions should be delivered in a combination of settings to achieve high breastfeeding rates.