
You’re invited: welcome to the dynamic world of quality improvement and implementation science
Lisa M Goldthwaite1, Cati G Brown-Johnson2

1 Upstream USA, Boston, Massachusetts, USA
2 Department of Medicine, Stanford University School of Medicine, Stanford, California, USA

Correspondence to Dr Lisa M Goldthwaite, Upstream USA, Boston, Massachusetts, USA; lgoldthwaite{at}upstream.org


As defined by the WHO, postpartum family planning (PPFP) focuses on the prevention of unintended and closely spaced pregnancies through the first 12 months following childbirth.1 Access to postpartum contraception supports both individual reproductive goals and population health outcomes, and much is known about the safety and efficacy of various contraceptive methods in the postpartum period. In the field of PPFP, as is the case with any field of medicine, clinical innovation and evidence accrue over time, resulting ultimately in the development of best practices and clinical guidelines. Even with best practices in hand, however, we know that moving from evidence to practice can take decades and is always shaped by local context.2 We also know that patient perspectives provide critical insights into the actual needs and real-life barriers within each specific local context.3

In this issue of the Journal, Hofmeyr and colleagues describe a quality improvement initiative in a large public hospital in Botswana, aimed at improving postpartum contraceptive counselling and provision, with specific foci on integrating immediate postpartum intrauterine device services into their practice setting and monitoring patient experience.4 We commend the authors for undertaking this study. This kind of detailed quality improvement work is worth publishing, and elevates accounts of quality improvement and implementation into the emerging knowledge base of implementation science.

Intuitively, we may understand that quality improvement is what it says it is: improving the quality of care. Implementation science is perhaps a next step in the continuum of quality: it is the science (represented by generalisable knowledge, principles, frameworks, theories and tools) of implementing interventions in real-world contexts.5 6 Rather than a randomised controlled trial, where we intentionally control all elements outside of our intervention, implementation science seeks to understand what supports or undermines an intervention in its specific context. Implementation science, like quality improvement, is all about the how. How is this intervention best delivered locally?

In their article, Hofmeyr et al only mention ‘implementation’ three times, but we see implementation science principles woven throughout, with emphasis on intervention champions (eg, ward clinicians providing mentorship) and resources (eg, posters used to support patient education and decision-making). Taking a step back, we find ourselves asking: Really, why does it matter if we call this quality improvement or implementation science? What extra lift does it give us as clinicians and researchers to engage with esoteric frameworks and theories of implementation science? We propose that engaging with implementation frameworks in particular can help practitioners do three important things (figure 1).

Figure 1

Steps for interacting with an Implementation Science Framework using the example of the WHO postpartum family planning (PPFP) programme model. *Reflect: Use the framework to reflect on what you have done and how it connects to other successful and unsuccessful attempts. In this example, boxes have been checked off on the framework to indicate areas taken into account with the project. **Gaps and opportunities: Consider the gaps and opportunities in the work you have done based on the existing framework. In this example, apparent gaps have been circled. ***Your contribution: What specific strategies from your setting might be generalisable and are not currently represented in the framework? Could those ideas be your contribution, incorporated into the next iteration of the framework to benefit others? In this example, one might add specific successful tools to the Activities box. Adapted from Figure 1 in the WHO publication ‘Programming Strategies for Postpartum Family Planning’.1

1. Reflect: Understand more about what you have done and how it connects to other successful and unsuccessful attempts.

  • Applying the WHO PPFP programme model to the work done by Hofmeyr et al, we can identify that their team emphasised the building blocks (‘facilitators’) of their health workforce, information, contraceptives and community. They also documented outcomes related to coverage, quality and contraceptive use.

2. Gaps and opportunities: Consider the gaps and opportunities in the work you have done.

  • Again, reviewing this WHO PPFP implementation framework, we see opportunities that are not currently reported around leadership, governance, financing, and patient involvement in design. What efforts could have been made in these areas or have been made around these issues since this project was initiated? How might early consideration of financing and patient preferences support sustainability of this programme?

3. Your contribution: Finally, as a good citizen of science, think about and document specific strategies from your setting that are not represented in the framework but might be generalisable, and could be your contribution to an evolution of the framework and further studies in quality and implementation science.

For Hofmeyr et al’s study, their detailed activities could contribute to this framework by providing more specific tools for the toolbox of possible interventions.

Focusing on patient inclusion efforts, we applaud Hofmeyr et al for using patient satisfaction as an aspect of their programme analysis. In any improvement work, talking with patients can ensure that changes ultimately result in care appropriate to the local context, and sustainability in terms of adoption and success, both of which are major implementation science considerations. Patient input can be gathered at many points in improvement work, including programme evaluation as we see in this example. We encourage those doing improvement work to consider also inviting patient representatives onto project teams early in project development, in order to incorporate their constructive insights from the beginning, thus centring patient voices throughout design and implementation.

In the American context, listening to patient and community voices in response to the last decade of postpartum contraceptive access initiatives has begun to inform an evolution in service delivery and primary project goals. Specifically, many voices are concerned that an overemphasis on long-acting reversible contraception (LARC) methods could have resulted in coercive practices and even reduced patient autonomy, in a space where centring patient goals and preferences is essential.7 Evaluation of programmes focusing on LARC uncovered elements of both subtle and overt bias that could have been mitigated by early and robust patient involvement.8 It is also worth noting that LARC-related services are simply harder to implement in a setting where they did not previously exist than other contraceptive methods are. But while more resources may be necessary to scale up LARC services, keeping a wider frame with the ultimate goal of improving access to all safe contraceptive methods in the postpartum period positively serves to centre patient reproductive goals, needs, values and preferences.

Regardless of whether we call it quality improvement or implementation science, or whether we include patients from the start or in the final evaluation, what is ultimately important is that efforts are being made to improve the quality of care, and that as a scientific community we are communicating about those efforts. Sharing lessons learnt from quality improvement initiatives through publications amplifies these experiences. Using these findings to contribute towards the evolution of improvement science frameworks further widens the scope of impact. Incorporating patient voices throughout improvement work can serve as a guidepost to ensure high-quality patient-centred improvement. When you write about your experiences, we get to have a front row seat to your success and, perhaps even more importantly, we learn from any missteps. Do not let any fancy words throw you off the scent. When you are motivated to improve care around you, find a team and jump in. Write down what you did along the way. And when time and resources allow, we invite you to send us a note and tell us how it went.

Ethics statements

Patient consent for publication

Ethics approval

Not applicable.

References

Footnotes

  • Contributors LMG and CGBJ coauthored this editorial, working collaboratively to develop the concept and to outline, draft and edit the manuscript.

  • Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.

  • Competing interests None declared.

  • Provenance and peer review Commissioned; internally peer reviewed.