Original research

Navigating process evaluation in co-creation: a Health CASCADE scoping review of used frameworks and assessed components

Abstract

Background Co-creation is seen as a way to ensure that all relevant needs and perspectives are included in an intervention; to increase its potential for beneficial effects and uptake, process evaluation is crucial. However, existing process evaluation frameworks have been built on practices characterised by interventions developed and implemented top-down and may be limited in capturing essential elements of co-creation. This study aims to review studies planning and/or conducting a process evaluation of public health interventions adopting a co-creation approach, and to derive the process evaluation components assessed, the frameworks used and insights into formative and/or participatory evaluation.

Methods We searched for studies on Scopus and the Health CASCADE Co-Creation Database. Co-authors performed a concept-mapping exercise to create a set of overarching dimensions for clustering the identified process evaluation components.

Results 54 studies were included. Conceptualisations of process evaluation in the included studies concerned intervention implementation, outcome evaluation, mechanisms of impact, context and the co-creation process. 22 studies (40%) referenced ten existing process evaluation or evaluation frameworks; the most referenced were the frameworks developed by Moore et al (14%), Saunders et al (5%), Steckler and Linnan (5%) and Nielsen and Randall (5%).

38 process evaluation components were identified, with a focus on participation (48%), context (40%), the experience of co-creators (29%), impact (29%), satisfaction (25%) and fidelity (24%).

13 studies (24%) conducted formative evaluation, 37 (68%) conducted summative evaluation and 2 studies (3%) conducted participatory evaluation.

Conclusion The broad spectrum of process evaluation components addressed in co-creation studies, covering both the evaluation of the co-creation process and the intervention implementation, highlights the need for a process evaluation tailored to co-creation studies. This work provides an overview of process evaluation components, clustered in dimensions and reflections which researchers and practitioners can use to plan a process evaluation of a co-creation process and intervention.

What is already known on this topic

  • There is a growing recognition of the value of process evaluation.

  • The absence of process evaluation frameworks built to suit the context of co-creation makes it unclear whether they are adequate for this specific context.

What this study adds

  • The results demonstrate a fragmented interpretation of process evaluation in the context of co-creation.

  • Most assessed process evaluation components relate to participation, context, experience of co-creators, impact, satisfaction and fidelity.

  • The majority of studies do not reference existing process evaluation frameworks, with the UK Medical Research Council Guidance being the most referenced framework.

How this study might affect research, practice or policy

  • The study highlights the need to enhance existing process evaluation frameworks with additional characteristics and components relevant to co-creation.

  • The study suggests considering both the co-creation process and intervention implementation as interventions and conducting process evaluations for each.

  • The study recommends the use of formative evaluation.

Introduction

Co-creation is advocated as a means to develop solutions (e.g., an intervention to improve public health) that meet the needs and wishes of the population of interest and other relevant stakeholders through a collaborative approach to innovative problem-solving. This approach involves a wide range of stakeholders throughout all phases of a project,1 from identifying or defining the problem to the project’s concluding stages,2 to co-create effective and sustainable solutions that align with the needs and preferences of all relevant stakeholders.3 It has been considered a promising approach to increase the effectiveness and impact of public health interventions and to contribute to closing the implementation gap,4 and it is seen as particularly valuable in the context of marginalised communities.4 5

However, co-creation risks tokenistic and ineffective applications without a rigorous methodology.

Process evaluation in particular has been regarded as crucial to contextualise, explain and strengthen the science behind public health interventions.6 Its understanding has evolved over time. In its early stages, it primarily involved the assessment of implementation through the analysis of quantitative process indicators for interpreting the results obtained from effectiveness studies. Later, there was increased recognition of the need for qualitative research alongside trials to place greater value on the context, the acceptability of an intervention and implementation issues.7 This understanding of process evaluation is exemplified by the framework of Saunders et al,8 which focuses on capturing intervention implementation aspects, such as fidelity to the protocol, the number of intervention activities implemented and intended topics covered, attendance rates, recruitment procedures and contextual factors that may have affected the intervention implementation.

Since 2010, process evaluation has expanded in scope to include the exploration of mechanisms of impact. For instance, through the UK Medical Research Council (MRC) guidance,9 authors propose understanding process evaluation not only as a way to report on intervention implementation but also as an opportunity to explore elements that may help to explain how a certain impact has been achieved. Process evaluation is described by the MRC guidance and recent studies as a way to assess fidelity and quality of implementation, clarify causal mechanisms and identify contextual factors associated with variation in outcomes.10–12 It is described as applicable and valuable at both the intervention development and implementation stages.9

Applied to co-creation, an evaluation of the process is crucial both at the development stage (ie, co-creating the intervention) and at the implementation stage (ie, implementing the co-created intervention). At both stages, a process evaluation can serve to identify areas of improvement and to ensure that the diverse perspectives and contributions of stakeholders are meaningfully integrated and that co-creators experience a sense of joint ownership.13 It allows co-creation efforts to evolve and become more effective in addressing public health issues by meeting the needs and wishes of the communities and individuals involved.13 Despite being crucial to ensure a meaningful practice and an evidence-based assessment of the co-creation process and the developed solution/intervention, no process evaluation framework has yet been designed explicitly for the context of co-creation. As co-creation is an underused yet emerging approach in public health,1 3 14 we observe a lack of evaluation frameworks that account for essential aspects of the co-creation process15 and that align with the most recent literature on co-created public health interventions.16

For this reason, this review aims to explore how process evaluation is conceptualised, planned for and conducted in the context of co-creation, by providing an overview of process evaluation conceptualisations, the evaluation frameworks used and the components assessed at both the development and implementation stages. It represents the background and exploratory work on the ways in which process evaluation is conducted in co-creation projects, which will inform the recommendations to be published in our follow-up study.

Furthermore, several studies applying a co-creation approach have highlighted the importance of capturing stakeholders’ perceptions and experience of the process and of using these to guide the intervention itself and/or adjustments and adaptations during co-creation.17 18 This type of formative evaluation has previously been regarded as valuable in the context of co-creation and participatory research approaches.18–20 Hence, this review additionally aims to explore the extent to which included studies planned for or conducted a formative evaluation, that is, conducted, analysed or reported back evaluation results during the process to provide feedback to the co-creators and/or research team so as to adapt or improve the process.21

Finally, as engagement with the population of interest and stakeholders in co-creation processes is assumed to be happening throughout,2 in this study, we are interested in exploring the extent to which included studies planned for or conducted a participatory evaluation as part of the process evaluation. Participatory evaluation is described as a type of evaluation approach in which stakeholders are involved in the design of the evaluation, the data analysis or reporting.22

Overall, this study seeks to provide a review of studies planning and/or conducting a process evaluation of public health interventions adopting a co-creation approach, to derive the process evaluation components assessed and the evaluation frameworks used, and to assess the extent to which studies conducted formative and/or participatory evaluation.

Methods

This research was conducted in two parts. First, we conducted a scoping review to identify frameworks and components used in the evaluation of a co-creation process and implementation of the related co-created interventions. Then, concept mapping23 was applied to identify a set of overarching dimensions to cluster the identified components.

Search strategy

This scoping review followed the PRISMA-ScR (Preferred Reporting Items for Systematic reviews and Meta-Analyses extension for Scoping Reviews) guidelines.24

We searched the Health CASCADE and Scopus databases with the same search strategy: co-creat* OR cocreat* AND process AND evaluation. The Health CASCADE database is a recently published open-access database of peer-reviewed articles about co-creation across various fields.1 It was produced within the Health CASCADE project, a European-funded project aiming to develop the methodological foundation of evidence-based co-creation.25 The search on both databases was conducted with no time or language limitations. Following the database search, articles were exported into a CSV file to remove duplicates in Excel. The articles were then imported and screened in Rayyan.
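The deduplication step was performed in Excel; as a minimal scripted equivalent, a pandas sketch is shown below. The file and column names are assumptions for illustration, not the study's actual export format.

```python
# Minimal deduplication sketch; the review itself used Excel. File and
# column names ('title') are illustrative assumptions.
import pandas as pd

scopus = pd.read_csv("scopus_export.csv")
cascade = pd.read_csv("health_cascade_export.csv")

# Merge both database exports into a single record set.
records = pd.concat([scopus, cascade], ignore_index=True)

# Normalise titles so case/whitespace differences do not mask duplicates,
# then keep the first occurrence of each title.
records["title_norm"] = records["title"].str.lower().str.strip()
records = records.drop_duplicates(subset="title_norm").drop(columns="title_norm")

# Export the deduplicated set for title/abstract screening in Rayyan.
records.to_csv("records_for_rayyan.csv", index=False)
```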

Process of selection

All studies were double-screened by several reviewers at title and abstract (GRL, JdB, KG, DMA, LM and SC) and at full text (GRL, JdB, KG, DMA, QA, TA, MV and MG-G), and irrelevant studies were removed against the agreed set of criteria. Differences of opinion regarding inclusion or exclusion were resolved by discussion and consensus and, where this was not possible, by the involvement of a third reviewer (GRL, JdB, KG, DMA, QA, TA, MV and MG-G).

Eligibility criteria

In line with the recommendations of Levac et al, 26 the criteria for study inclusion were refined through iterative discussion among the research team. Articles were included if they complied with the definition of co-creation intended as ‘an evidence-based methodology for the development, implementation and evaluation of innovations through continuous, open collaboration, interactional knowledge production and shared decision-making among key stakeholders, directed at improving public health’.27

We included studies that explicitly mentioned planning or conducting a process evaluation of (a) the co-creation process at any of the intervention/project stages (eg, the engagement with relevant stakeholders in the needs analysis; intervention development) and/or (b) the implementation of the co-created interventions (eg, how the co-created intervention was carried out and received, and examining its fidelity, quality and acceptability). Included studies related to the public health field, defined as all organised measures (whether public or private) to prevent disease, promote health and prolong life among the population as a whole.28 All studies included had to be empirical studies, that is, gathering data based on experience, observations or experimentation.29

Full inclusion criteria for title and abstract screening and full text can be found in online supplemental file 1.

Data extraction

A template was developed in Excel to facilitate the extraction of information about included articles (see online supplemental file 2), including data related to the definition of process evaluation, if applicable; frameworks used to guide the evaluation, if applicable; process evaluation components; and whether included articles conducted a formative evaluation21 or a participatory evaluation.22 We also extracted information on the components assessed as part of the process evaluation. All data were independently extracted by two reviewers (GRL, JdB, KG, DMA, QA, TA, MV and JRZR); in case of discrepancies, MG-G and GRL were involved and consensus was reached for the final extraction.

Data analysis

To synthesise research findings related to the identified components, the extracted components were clustered by the first author (GRL) according to similarities. For instance, if we encountered components that were extracted and labelled as ‘facilitation’, we clustered these together with any related components that shared a similar thematic element, such as ‘facilitation of patients’ involvement’. In case of uncertainty, the last author (MG-G) was consulted.

In order to synthesise the identified components into a visually accessible format and to provide a structure to the results, we aimed to delineate a set of dimensions encompassing all individual components. To do so, all co-authors participated in three iteration rounds of consensus-making. First, to identify overall dimensions, co-authors were invited to independently group components and assign a name to each cluster via the online programme Trello.com. Each cluster would represent a dimension. Second, during an in-person meeting, using all dimensions that were drafted individually as a base, co-authors, as a group, sought consensus on a set of final dimensions.

Once dimensions were set, co-authors were individually asked to sort all components into the identified dimensions via the same online programme. We set a consensus threshold requiring that more than 50% of the co-authors agree on the placement of each component within a specific dimension. More than 50% agreement was obtained for all sorted components.
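As an illustration of this rule, a minimal sketch (with hypothetical placements, not the study's actual sorting data) tallies each co-author's vote per component and checks whether the majority dimension passes the >50% threshold:

```python
# Illustrative sketch of the >50% consensus rule used for sorting
# components into dimensions; the placements below are hypothetical.
from collections import Counter

# Each entry: component -> one dimension vote per co-author.
votes = {
    "fidelity":     ["delivery", "delivery", "delivery", "impact", "delivery"],
    "satisfaction": ["experience", "experience", "impact", "experience", "experience"],
    "mapping":      ["context", "context", "delivery", "impact", "context"],
}

for component, placements in votes.items():
    # Most frequent dimension and its vote count.
    dimension, count = Counter(placements).most_common(1)[0]
    agreement = count / len(placements)
    status = "consensus" if agreement > 0.5 else "no consensus"
    print(f"{component}: {dimension} ({agreement:.0%} agreement, {status})")
```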

Results

By reviewing and analysing included studies, this review provides an overview of how process evaluation was conceptualised and conducted in co-creation projects. It achieves this by describing the included studies, the frameworks used and any adaptations made to those frameworks, and by reporting on the assessed process evaluation components.

Of the 1882 articles originally retrieved, 119 duplicates were removed and 1615 articles were excluded at title and abstract screening. 79 articles were excluded after full-text screening, resulting in the inclusion of 54 studies. The PRISMA extension for scoping reviews has been used to present the screening process (figure 1).

PRISMA-ScR flow.

Overview of included studies

Online supplemental file 3 shows the included studies and details about the authors, publishing year, the country in which the study was set and specifies whether the study applied formative evaluation and/or participatory evaluation. The majority of included studies were conducted in the USA (18%), followed by Canada (7%), the Netherlands (5%) and the UK (4%), with the rest spread across various countries, each representing 1%–3% of the total. Studies primarily focused on obesity prevention (7%) and mental health (4%). Other topics included nutrition, physical activity, workplace wellness and various public health issues such as HIV prevention, breast cancer, drug use, occupational health and more (1%–3% of the total).

All included studies were published between 2002 and 2022. Publications increased between 2003 and 2017, with a peak of eight in 2016 and seven in 2017, and publication numbers continued to grow from 2018 onward.

Formative and participatory evaluation

13 studies (24%) conducted formative evaluation during either the co-creation process or the intervention’s implementation and 37 (68%) conducted process evaluation after the intervention’s implementation (ie, summative evaluation). Two studies conducted a participatory evaluation (3%) while the remaining (97%) did not.

When it comes to formative evaluation, several authors identified potential implementation barriers and facilitators to support future adaptations or iterations of the intervention implementation.30–35 In some studies, the research team asked participants to reflect on perceptions related to their engagement36 or expectations,37 in order to adapt, if deemed necessary, the following intervention sessions, such as workshops and/or activities.36 38

Two studies conducted participatory evaluation in different forms. Gibbons et al 39 reported that, when presenting to the group, interested partners iteratively shared their thoughts, concerns and suggestions regarding the findings and their interpretation. More comprehensively, Harper et al 33 engaged with community representatives from the planning of the process evaluation, through the choice of evaluation methods and strategies, balancing community sensitivity and scientific rigour, up to the interpretation of findings.

Process evaluation conceptualisation

24 studies (44%) did not explicitly define process evaluation. 30 studies (56%) included an explicit definition of process evaluation within the manuscript, whether by referencing an existing study or by providing a definition themselves. Definitions of process evaluation provided by the included studies, including extracted quotes, are available in online supplemental file 4.

The manner in which process evaluation aims were described provides insight into how the conceptualisation of process evaluation varied across studies. While the evaluation of intervention implementation was taken into account by the majority of the studies, several authors focused on other elements, including outcome-related evaluation,33 35 40 41 mechanisms driving impact,34 42–53 the contextual factors at play42 43 47 49 54–56 and the co-creation process.31 37–41 48 49 51 54 57–59

Several studies referred to process evaluation as the monitoring and reporting of intervention implementation and delivery. In these instances, process evaluation strives to paint ‘a clear, descriptive picture of the quality of the programme elements being put into place and what is taking place as the programme proceeds’.58 60 Parker et al,61 for instance, used process evaluation to gauge the extent, fidelity and quality of the intervention implementation. Similarly, Sormunen et al described process evaluation as ‘a process through which to report on structure and activities of the programme or intervention’.50

Four authors included an evaluation of outcomes as part of their process evaluation, defining this in several ways, including as a process in which you may analyse the ‘outcomes of the process used in the intervention’40 and ‘as a way to establish whether the partnership and project activities have been as intended and resulted in the expected outputs’.41 Magnusson et al described process evaluation as the procedure which ‘will monitor the processes in terms of reaching the intended outcome’35 while Harper et al described process evaluation’s goal as ‘to clarify anticipated outcome goals and criteria used in outcome evaluations that measure a programme’s relevance and accomplishments’.33

Authors of the included studies also aimed to comprehend impact mechanisms as part of their process evaluation.34 42–53 Studies citing Steckler and Linnan62 described process evaluation as a means of shedding light on why some interventions produced the intended results and why others did not. Studies citing Moore et al 9 stressed the importance of examining the nature of what was implemented in practice and understanding the context around the intervention outcomes to inform future programmes. In this light, process evaluation is said to allow researchers ‘to draw inferences about future applicability in the current setting and about generalisability and transferability to other settings’.63 Anselma et al,64 among others, stressed that process evaluations should help to gain a deeper understanding of the more and less effective elements of interventions, as well as facilitators of and barriers to the intervention’s maintenance/sustainability.

The intention to capture mechanisms of impact ties in with the evaluation of contextual implementation barriers and facilitators. To capture mechanisms of impact, studies stressed the relevance of assessing contextual factors that may influence the co-creation process and intervention. Gathering insights about the intervention’s context, as part of the process evaluation, is seen as a way to ‘understand how and why the programmes work, and under what conditions’.46 Similarly, Palmer et al,65 citing Glasgow et al,66 described process evaluation as capturing information about emerging barriers and facilitators to implementing change and identifying contextual (organisational and environmental) factors that affect the intervention.

Lastly, several studies referred to an evaluation of the co-creation process and related aspects when describing process evaluation.31 37–41 48 49 51 54 57–59 Fusari et al,43 for instance, highlighted the use of process evaluation as a way to learn about the engagement mechanisms of participants and stakeholders and to unveil insights about impact mechanisms that may be necessary for scale-up. Tolma et al 58 included, as part of their process evaluation, the intention to evaluate stakeholders’ reactions, such as ‘the level of participation among intended recipients to the programme and reactions of the intended recipients to the programme’.58 Greer et al 40 and Anselma et al 64 included the assessment of the capacity building and empowerment enabled as a result of the engagement as part of their process evaluation.

Frameworks used

Eight studies (14%)42 43 47 49 54–56 cited the evaluation or process evaluation framework developed by Moore et al,9 three studies (5%)48 51 58 cited Steckler and Linnan,62 three (5%)35 67 68 cited Saunders et al,8 three (5%)55 63 70 cited Nielsen and Randall,69 two studies (3%)40 71 cited Greer et al,40 two (3%)64 65 cited Glasgow et al,66 one study (1%)34 cited Damschroder et al,72 one study (1%)73 Nielsen and Abildgaard,74 one study (1%)59 Rowe and Frewer75 and one study (1%)76 Grant et al.77

Table 1 presents, in order of highest to lowest number of cited times, details of the frameworks that were used in the included studies to guide the process evaluation, including the modifications to the original framework.

Table 1
Frameworks used in studies to guide process evaluation

Most studies adapted frameworks to include evaluation elements referring to the co-creation process and the related experiences of, and perceptions regarding, the intervention implementation and co-creation process.59 64 67 Additions to the MRC guidance9 included evaluation elements related to the participants’ experience of engaging in the co-creation process and/or intervention implementation. Cedstrand et al 55 integrated Nielsen and Randall’s framework69 into the MRC guidance, while Fusari et al 43 included the use of a logic model.

To Nielsen and Randall’s framework,69 Yeary et al 51 included the assessment of acceptability and satisfaction with the intervention components and awareness of the intervention, while Tolma et al 58 further looked into barriers to intervention maintenance. Yeary et al 51 also added evaluation elements related to the acceptability of intervention components (satisfaction) and the intervention reach (awareness of the intervention).

Dimensions

Figure 2 presents a visual representation of identified process evaluation components clustered in overarching dimensions.

Overview of dimensions and components identified through included studies.

Process evaluation dimensions

Each dimension and component may apply to both the co-creation process and the implementation of the intervention.

‘Delivery’ components measured the degree to which the co-creation process and/or intervention implementation was delivered as intended. This includes the reporting of the number of co-creation and/or intervention sessions (e.g., workshops) delivered, the number of participants involved, etc, and reports on changes with respect to the original protocol. The dimension of ‘delivery’ encompasses the following process evaluation components: delivery, dose delivered, adherence, adaptation, dose received, exposure and fidelity.

‘Participation’ includes components assessing the extent to which individuals or groups have engaged with and participated in the co-creation process and/or implemented intervention. It includes components measuring the level of involvement and active engagement of the population of interest and/or end-users during the co-creation process and/or in the intervention, including the self-perceived degree of shared ownership and commitment. The latter may be observed and reported by facilitators and/or reported by participants. The dimension of ‘participation’ encompasses the following process evaluation components: participation, motivation, retention, facilitation, methods, partnership and recruitment.

‘Experience’ captures components measuring and assessing the subjective perception and evaluation of co-creation process and/or the implementation of the intervention by the individuals or groups who participated in it. It includes the assessment of (a) the experience related to the co-creation process and/or (b) the overall experience and involvement with the intervention implementation and actions. The dimension of ‘experience’ encompasses the following process evaluation components: acceptability, expectations, perceptions and satisfaction.

‘Context’ relates to components intended to examine the broader social, cultural, economic and political factors which create the system that can impact the success or failure of the intervention. The purpose of evaluating context might be to (a) understand the systemic factors which have influenced the public health issue at hand, (b) help ensure that the co-creation process and intervention are appropriately tailored to the specific context in which they are being implemented and (c) understand which environmental factors have had an impact on the co-creation process or intervention implementation. The dimension of ‘context’ encompasses the following process evaluation components: mapping, context, feasibility, readiness for change, support and resources.

‘Maintenance’ includes components that assess the extent to which the intervention outcomes and/or relationships formed during the co-creation process and/or implementation of the intervention are being maintained. The dimension of ‘maintenance’ encompasses the following process evaluation components: maintenance, retention and future organisation.

‘Impact’ relates to components assessing the extent to which the co-creation process and/or implementation of the intervention has achieved one or more of its desired outcome(s) and its overall impact, including, for example, empowerment, self-reported or reported attitudes and/or changes towards the targeted health behaviour, self-perceived increase of well-being, awareness and satisfaction related to the participation in the process. The dimension of ‘impact’ encompasses the following process evaluation components: mechanisms of impact, impact, adoption, empowerment, capacity building, knowledge integration and evidence, communication, policy change and reach.

Process evaluation components

Among the most evaluated components are participation (26 studies, 48%), context (22, 40%) and experience of co-creators (16, 29%), together with impact (16, 29%), satisfaction (14, 25%) and fidelity (13, 24%). Other components, in order of frequency of use, include the following: recruitment, reach, dose delivered, readiness for change, delivery, empowerment, motivation, dose received, support, capacity building, perceptions, maintenance, facilitation, communication, adherence, feasibility, exposure, adoption, adaptation, knowledge integration and evidence, resources, future organisation, policy change, partnership, methods, expectations, acceptability and retention.

We describe below the most evaluated components (>23%), namely participation, context and experience of co-creators, impact, satisfaction and fidelity. A description of all components, as intended by the authors of the included studies, including the frequency of use, can be found in online supplemental file 5.

Participation

26 studies assessed participation as part of their process evaluation, including the extent to which the individuals or groups targeted by the intervention engaged with and participated in the co-creation process and/or implementation of the intervention. Studies assessed the nature and degree of participation37 78–80 and, more specifically, whether it was voluntary, that is, the extent to which there was a voluntary shift of responsibilities from providers to users,80 or equitable, ensuring all experiences were listened to, respected and represented at the table.30 45 71 81 Some assessed the extent to which there was continued or early engagement of communities throughout the process,45 59 78 82 including whether the objectives were set out and agreed by stakeholders at the start of the process,45 whether they had the chance and time to discuss and continuously revise the action plans30 73 or whether participants agreed they were targeting the most important problems in the intervention.73 83

Studies also specifically measured the participants’ involvement in decision-making,82 participants’ feelings regarding the transparency of the process,82 the occurrence of joint actions to meet community needs,60 the extent to which participants felt joint ownership63 or shared responsibility for the intervention.70 Studies also assessed the perspectives of participants on the process70 84 and, specifically, whether they felt involved in the intervention,63 had established a trustful and open relationship with the working team45 85 and how they perceived the impact or accomplishment of the engagement process.39 Clark and Laing86 assessed the value of knowledge exchange while participating. den Broeder et al 87 looked at perceived factors facilitating or hindering the development of consensus and perceptions of the level of perceived consensus and actual consensus.

Other studies evaluated the benefits and barriers39 88 89 and implementation determinants related to the engagement process.79 Kelly and Van Vlaenderen78 focused on assessing the degree to which the communicative problematics of participation had been identified and dealt with in a project. Dennehy et al 90 used Lundy’s Model of Participation91 to operationalise participation, focusing on the evaluation of perceptions related to the creation of an inclusive and safe space for children, facilitation and the extent to which children’s views are listened to and acted on.

Context

22 studies reported an assessment of context as part of their process evaluation, examining the broader social, cultural, economic and political factors impacting the success or failure of the intervention in a specific context.

Studies mostly evaluated the contextual factors that might impact or have impacted the intervention planning and implementation.42 51 67 68 73 A wide range of approaches to defining context was used. Reeve et al 49 assessed context as the larger social, political and economic environment that may influence the implementation of an intervention. Igel et al 47 included the evaluation of existing social, health and environmental issues, while Schelvis et al 92 explored the organisational and environmental characteristics that affect the intervention. Tolma et al 58 reviewed aspects related to the larger social, political and economic environment, and Gensby et al 46 highlighted the importance of considering the political-administrative context in which rehabilitation programmes are practised. Robertson et al 56 focused on broader community and environmental factors, such as socioeconomic considerations and community participation.

Studies explored implementation barriers and enablers,31 45 58 93 94 some focusing specifically on existing organisational structures, professional values or sociopolitical context that enable successful implementation,95 96 environmental factors,30 resources available52 56 or events that occurred and influenced the content of the execution of the action plan.63 Beckerman-Hsu et al 76 also specifically looked at moderators and the extent to which their role impacts implementation.

Authors also mapped the characteristics and distribution of a specific population or health issue in a particular geographical area. They identified, analysed and considered the systematic representation of relevant stakeholders,45 96 and aimed to clarify context, processes and activities,96 to understand the community85 and to identify the contextual and procedural drivers of any wanted change.57

Experience

16 studies evaluated the experience of participants and assessed the subjective perception of individuals or groups who participated in the co-creation process and/or intervention implementation. The majority of the studies48 54–59 assessed overall experience and involvement with the implemented intervention and actions while others31 55 60 evaluated how the participants specifically experienced the participatory process, or the coordination and collaboration in the process.59

Impact

16 studies assessed impact-related measures concerning the extent to which the intervention had achieved one or more of its desired outcome(s) and its overall impact. This included evaluating the impact of the intervention on the collaborative and equitable involvement of its members,97 patient health and well-being,98 employee engagement and participation in work,99 line manager attitudes and actions92 and the personal impact on advisory group members.90

Reeve et al 49 evaluated patients’ perceptions of the overall impact of taking part in the intervention. Heggdal et al 98 specifically reported on whether the intervention had the intended effect on patient health and well-being and whether it had prompted individuals to be more active or led to changes in their health behaviours.83 84 92

Others have evaluated the institutional and organisational changes taking place among and beyond the group of participants57 92 99 and outcomes that were a result of the engagement process between several parties involved.61 79 100 Chrisman et al 60 assessed the concrete achievements of the intervention, such as the number of publications, programmes, evaluations and grants that have been produced.

Some studies focused on evaluating mechanisms of impact and examined how the intervention produced its intended outcomes. Some studies aimed to identify the specific causal mechanisms or pathways that linked the intervention to the observed changes in health-related behaviours, health outcomes or other targeted outcomes42 47 and one study specifically looked at factors and mechanisms which contributed to citizen participation and intersectoral collaboration.101

Satisfaction

14 studies assessed the level of satisfaction among the participants and/or end-users who received or participated in a co-creation process and/or public health intervention.

Satisfaction was evaluated in various aspects of the intervention, such as the overall intervention,50 63 67 84 97 102 design and implementation102 and more specifically, the partnership,97 the research process,97 products97 or team building process102 and dialogue103 and the progress of the co-creation group.84 Some studies assessed satisfaction with specific stages of the process, including satisfaction with the needs assessment phase and the developed action plan.63 Lelie et al 70 registered satisfaction with the appropriateness of tools and materials, intervention activities and intervention approach. Schelvis et al 92 aimed to capture satisfaction levels with the participatory process.

Fidelity

Fidelity was assessed in 13 studies and refers to measuring the extent to which an intervention was delivered as intended, according to the original programme design or protocol. Studies evaluated fidelity by determining whether the intervention was implemented consistently and faithfully across different settings and by identifying any variations or adaptations made during implementation.32 42 46 51 55 58 61 63 67 68 92 104

Discussion

Broadening the scope of process evaluation for co-creation

The increased number of publications on process evaluations of co-creation projects included in the current review indicates not only a growing interest in the field but also a recognition of its potential benefits and relevance. However, the field of process evaluation in co-creation requires further research. As previous reviews recommend,105 106 it is yet to be understood why process evaluation frameworks are so scarcely applied. The results of the current review align with those of two separate reviews on the use of process evaluation by Lazo-Porras et al 105 and Liu et al,106 covering chronic and neglected tropical diseases in low-income and middle-income countries and primary care interventions addressing chronic disease, respectively. Both studies report a low percentage of included studies referencing existing process evaluation frameworks (12% and 31%, respectively). Among the recommendations for the use of process evaluation in the study by Lazo-Porras et al 105 was to standardise reporting to ensure consistency and comparability among studies.

Echoing the above-mentioned results and recommendation, the results of this review highlight the need for a standardised process evaluation specifically designed for co-creation. Such an evaluation should capture essential co-creation elements as part of the co-creation process as well as part of the implementation of the co-created solution. An evaluation of the co-creation process would need to include specific elements, such as an assessment of the active collaboration with stakeholders, the experience, the facilitation and the levels of participation. The process evaluation carried out by the included study by Dyer et al 59 illustrates this by focusing in depth on an evaluation of the engagement and participation of co-creators in the co-creation and implementation process. The authors include valuable evaluation elements relating to the following aspects: (a) the early engagement of communities in the process; (b) the identification, analysis and systematic representation of relevant stakeholders; (c) clear objectives set out and agreed by stakeholders at the start of the process; (d) the continued engagement of communities throughout the process; (e) relevant methods chosen and tailored to the context, participants and level of engagement; (f) highly skilled facilitation of the process; (g) integration of local and scientific knowledge; (h) open and meaningful information exchange and face-to-face interaction; (i) transparency, trust and fairness; (j) equality among stakeholders and (k) competent management throughout the process.

Most importantly, this review has surfaced a growing trend of bringing the co-creation process into the conceptualisation of process evaluation.31 37–41 48 49 51 54 57–59 Studies have done this by incorporating co-creation elements into existing process evaluation frameworks,59 64 67 including an assessment of experience34 48 49 54–56 70 107 108 and components related to participation.25 30 37 39 45 60 63 70 71 78 80 81 85 87 Placing value on the co-creation process and its evaluation might entail considering the co-creation process an intervention in itself, with its own impacts and process evaluation. An evaluation of the co-creation process might be crucial, as it is closely linked to the implementation of the co-created solution. Equally valuing the process of co-creation and the intervention implementation may enable us to grasp a more complete picture and to explore the relation between the process which co-created the solution (e.g., intervention) and the implementation of the solution/intervention itself.

Participatory evaluation and formative evaluation

Despite participatory evaluation being considered a potentially recurring approach to process evaluation, very few studies (3%) conducted one. We speculate that this could be attributed to potential challenges associated with its implementation, including the additional time it may require from participants and the possibility that it may not be perceived as highly significant by study authors. More guidance might be needed on how to conduct participatory evaluation in a way that is relevant to the stakeholders and adherent to co-creation principles. A first step might be, as done in the included study by Anselma et al,53 to share the effect and process evaluation plan and ask the population of interest, in this case children, to reflect on the proposed measures and to suggest potential additional evaluation outcomes or methods.

13 studies (24%) were found to adopt a formative evaluation approach. Formative evaluation has been considered useful for the identification and resolution of potential issues that could hinder the intervention’s implementation and/or related solution development109 and as an opportunity to explore whether the intervention is addressing a significant need, to use ongoing input for short-term adjustments and to detect and adjust, if needed, to unanticipated events and local adaptations.109

Formative evaluation, especially in the context of co-creation, has been considered valuable for pinpointing the population of interest’s and stakeholders’ feedback regarding the co-creation process and the implementation, and for tailoring implementation strategies.19 20 It may be particularly significant as a way to gauge stakeholders’ active participation, to ensure their perspectives are comprehensively captured and integrated into the intervention18–20 and to allow for fine-tuning of the intervention implementation, ensuring it aligns closely with stakeholders’ insights, feedback and concerns.35

Formative evaluation may be considered a characteristic inherent to co-creation, as the process is highly iterative.13 This iterative nature built into co-creation might represent a challenge when it comes to the evaluation of fidelity. A challenge might arise if formative evaluation is either not reported, as it usually happens rather informally, or avoided altogether, particularly in the case of well-controlled randomised trials, which may typically refrain from postapproval alterations.109 As co-creation adopts an approach which is receptive to stakeholders’ context and feedback, an evaluation should not solely report adherence to predetermined steps but also value, and where possible adapt to, the lived experience, knowledge and values of the co-creators.

To be able to measure the extent to which formative evaluation activities exert influence on the implementation, thorough reporting of modifications becomes essential. It is therefore important that the intention of the formative evaluation is made explicit and that the reasons for, and the nature of, applied modifications are reported, including why and how formative feedback was collected and used, by whom, and to what extent it was integrated into the modifications.109

Recommendations for future research

Based on a search of the published literature, this is the first scoping review of process evaluations planned or conducted in the context of co-creation for public health. Findings from this study have several implications for the field of process evaluation for co-creation.

First, the incorporation of extra elements into existing process evaluation frameworks and the focus on process evaluation components related to the co-creation process, such as experience, participation and satisfaction, suggest that existing process evaluation frameworks may fall short of comprehensively evaluating the co-creation process. It is also important to recognise, as expressed throughout the manuscript, the importance authors have placed on components related to context and mechanisms of impact.

Second, placing a focus on the co-creation process may necessitate valuing the co-creation process as an intervention in itself. Equally valuing both the co-creation process and the intervention implementation as distinct interventions, and conducting process evaluations for each, may help to provide a more comprehensive picture of co-creation.

Third, the high percentage of included studies using formative evaluation may suggest that it is key to the context of co-creation processes and may help account for the iterative nature of the approach, supporting the adaptation of the co-creation process and intervention to the co-creators’ lived experience, knowledge and values. Conversely, the limited use of participatory evaluation by included studies may suggest either a lack of perceived relevance or constraints in its practical implementation. This scoping review was conducted as part of the Health CASCADE project, and its findings will be used to inform the development of further guidance on planning and evaluating co-creation for public health. The authors involved in the guidance development will expand on the components identified, recommend methods for evaluation and include practical examples to support researchers and practitioners.

How to use this review?

We see this review as serving three distinct objectives. First, to provide an overview of existing conceptualisations of process evaluation and of the frameworks used to guide the planning of process evaluation for co-creation. Second, to identify the process evaluation components that previous studies took into account, to get a sense of what was valued as part of their planned or conducted process evaluation of co-creation. Lastly, to facilitate reflection on the process evaluation components that researchers and practitioners could consider when planning the process evaluation of co-creation in the field of public health.

Study limitations

First, the framework modifications detailed in table 1 stem from our subjective understanding of the components and may not have been explicitly reported as modifications in the included studies. Second, each identified process evaluation component described in online supplemental file 5 is presented as described by the authors of the included studies. No modifications were made to the clustering and description of identified process evaluation components, in order to portray accurately what had been done and how components were intended by the included authors. Finally, even though a >50% agreement sorting rule was set, some co-authors expressed difficulty in placing individual components into a single dimension, as they felt some could have related to several dimensions.

For the reasons expressed above, review findings should not be seen as a source of expert advice on process evaluation, but rather as a synthesis of current practice which can support reflection on planning for process evaluation in the context of co-creation. Furthermore, while almost all co-authors found most process evaluation components to be applicable and relevant to both stages, some shared the challenge of thinking of the components without categorising them into (a) the co-creation process and (b) the implementation of the co-created solution/intervention. For the development of the process evaluation framework for co-creation planned as a follow-up study, although we anticipate some overlaps, we will explicitly refer to these two stages distinctly.

Lastly, it should be noted that authors used their discretion to determine inclusion or exclusion, based on their own judgement and consensus between reviewers. Hence, the decision on whether studies complied with the set definition of co-creation reviewers on the reviewers’ own perceptions. Reviewers included studies if they perceived them as complying with the definition of co-creation, which was based on the reviewers’ own perceptions. Any inconsistencies were discussed with the involvement of a third reviewer and, if needed, discussed with a broader group of reviewers for alignment.

Conclusion

This study offers an overview of process evaluation frameworks and components reported in studies conducting process evaluation of co-creation in public health. Results show a pluralistic understanding of process evaluation, which varies according to authors and refers to process evaluation concepts related to intervention implementation, outcome evaluation, mechanisms of impact, context and the co-creation process.

Alongside standard process evaluation components that relate to the intervention’s implementation, authors of the included studies have placed attention on process evaluation components related to participation, context and the experience of co-creators, together with impact, satisfaction and fidelity. Overall, the study encourages the adoption of a holistic perspective on process evaluation, encompassing elements that allow for an enriched understanding of the process and for a comprehensive evaluation and replication of effective and meaningful interventions. By highlighting important gaps in the field, the findings also serve to inform future methodological work and guidance development on process evaluation, and can be used as guidance when planning for process evaluation.

  • Handling editor: Valery Ridde

  • X: @giulianalongi

  • Contributors: GRL and MG-G developed the study concept, to which all authors provided critical feedback. Title and abstract screening was performed by GRL, SC, JdB, DMA, LM and KG, and conflicts were resolved by GRL, TA, QA and MV. Full-text screening was performed by GRL, DMA, JdB, MV, TA, QA, KG and MG-G, and conflicts were resolved by GRL, KG, MV, DMA, JdB, MG-G, QA and TA. Data extraction was conducted by GRL, TA, MV, KG, JdB, QA, DMA and JRZR. The first draft of the manuscript was prepared by GRL. The first round of the concept-mapping exercise was performed by GRL, AD, MG-G, TA and MV, while the second was performed, in person, by MG-G, DMA, MV, LM, JdB, KG, BD and LD. The latter two, not listed as co-authors, are thanked for participating in the acknowledgements. All authors contributed to the article and approved the submitted version. GRL will act as the guarantor for this study.

  • Funding: This study has been funded by the European Union’s Horizon 2020 Research and Innovation Programme under the Marie Skłodowska-Curie grant agreement no 956501.

  • Disclaimer: The views expressed in this paper are the authors’ views and do not necessarily reflect those of the funders.

  • Competing interests: None declared.

  • Patient and public involvement: Patients and/or the public were not involved in the design, or conduct, or reporting, or dissemination plans of this research.

  • Provenance and peer review: Not commissioned; externally peer reviewed.

  • Supplemental material: This content has been supplied by the author(s). It has not been vetted by BMJ Publishing Group Limited (BMJ) and may not have been peer-reviewed. Any opinions or recommendations discussed are solely those of the author(s) and are not endorsed by BMJ. BMJ disclaims all liability and responsibility arising from any reliance placed on the content. Where the content includes any translated material, BMJ does not warrant the accuracy and reliability of the translations (including but not limited to local regulations, clinical guidelines, terminology, drug names and drug dosages), and is not responsible for any error and/or omissions arising from translation and adaptation or otherwise.

Data availability statement

All data relevant to the study are included in the article or uploaded as online supplemental information.

Ethics statements

Patient consent for publication:

Acknowledgements

The authors thank Benedicte Deforche and Lea Delfmann for contributing to the concept-mapping exercise.

  1. Agnello DM, Loisel QEA, An Q, et al. Establishing a Health CASCADE-curated open-access database to consolidate knowledge about co-creation: novel artificial intelligence-assisted methodology based on systematic reviews. J Med Internet Res 2023; 25.
  2. Vargas C, Whelan J, Brimblecombe J, et al. Co-creation, co-design, co-production for public health - a perspective on definition and distinctions. Public Health Res Pract 2022; 32.
  3. Longworth GR, Erikowa-Orighoye O, Anieto EM, et al. Conducting co-creation for public health in low and middle-income countries: a systematic review and key informant perspectives on implementation barriers and facilitators. Global Health 2024; 20:9.
  4. Greenhalgh T, Jackson C, Shaw S, et al. Achieving research impact through co-creation in community-based health services. Milbank Q 2016; 94:392–429.
  5. Halvorsrud K, Kucharska J, Adlington K, et al. Identifying evidence of effectiveness in the co-creation of research: a systematic review and meta-analysis of the international healthcare literature. J Public Health (Oxf) 2021; 43:197–208.
  6. Oakley A, Strange V, Bonell C, et al. Process evaluation in randomised controlled trials of complex interventions. BMJ 2006; 332:413–6.
  7. Lewin S, Glenton C, Oxman AD, et al. Use of qualitative methods alongside randomised controlled trials of complex healthcare interventions: methodological study. BMJ 2009; 339.
  8. Saunders RP, Evans MH, Joshi P, et al. Developing a process-evaluation plan for assessing health promotion program implementation: a how-to guide. Health Promot Pract 2005; 6:134–47.
  9. Moore GF, Audrey S, Barker M, et al. Process evaluation of complex interventions: Medical Research Council guidance. BMJ 2015; 350.
  10. Moore G, Audrey S, Barker M, et al. Process evaluation in complex public health intervention studies: the need for guidance. J Epidemiol Community Health 2014; 68:101–2.
  11. Ridde V, Pérez D, Robert E, et al. Using implementation science theories and frameworks in global health. BMJ Glob Health 2020; 5.
  12. Ridde V. Need for more and better implementation science in global health. BMJ Glob Health 2016; 1.
  13. Leask CF, Sandlund M, Skelton DA, et al. Framework, principles and recommendations for utilising participatory methodologies in the co-creation and evaluation of public health interventions. Res Involv Engagem 2019; 5.
  14. Messiha K, Chinapaw MJM, Ket HCFF, et al. Systematic review of contemporary theories used for co-creation, co-design and co-production in public health. J Public Health (Oxf) 2023; 45:723–37.
  15. Schelvis RMC, Wiezer NM, van der Beek AJ, et al. The effect of an organizational level participatory intervention in secondary vocational education on work-related health outcomes: results of a controlled trial. BMC Public Health 2017; 17.
  16. Nielsen K. Review article: how can we make organizational interventions work? Employees and line managers as actively crafting interventions. Hum Relat 2013; 66:1029–50.
  17. Lazo-Porras M, Perez-Leon S, Cardenas MK, et al. Lessons learned about co-creation: developing a complex intervention in rural Peru. 2022.
  18. van Dijk-de Vries A, Stevens A, van der Weijden T, et al. How to support a co-creative research approach in order to foster impact. The development of a co-creation impact compass for healthcare researchers. PLoS One 2020; 15.
  19. Elwy AR, Wasan AD, Gillman AG, et al. Using formative evaluation methods to improve clinical implementation efforts: description and an example. Psychiatry Res 2020; 283:112532.
  20. Murray E, May C, Mair F, et al. Development and formative evaluation of the e-Health Implementation Toolkit (e-HIT). BMC Med Inform Decis Mak 2010; 10.
  21. Bauer MS, Damschroder L, Hagedorn H, et al. An introduction to implementation science for the non-specialist. BMC Psychol 2015; 3.
  22. Cousins JB, Whitmore E. Framing participatory evaluation. New Dir Eval 1998; 1998:5–23.
  23. Trochim WMK. An introduction to concept mapping for planning and evaluation. Eval Program Plann 1989; 12:1–16.
  24. Tricco AC, Lillie E, Zarin W, et al. PRISMA Extension for Scoping Reviews (PRISMA-ScR): checklist and explanation. Ann Intern Med 2018; 169:467–73.
  25. Verloigne M, Altenburg T, Cardon G, et al. Making co-creation a trustworthy methodology for closing the implementation gap between knowledge and action in health promotion: the Health CASCADE project. Zenodo 2022. Available: here
  26. Levac D, Colquhoun H, O’Brien KK, et al. Scoping studies: advancing the methodology. Implement Sci 2010; 5:69.
  27. Messiha K. Deliverable D1.1 ESR1 document: preliminary synthesis. 2021.
  28. Thomson K, Bambra C, McNamara C, et al. The effects of public health policies on population health and health inequalities in European welfare states: protocol for an umbrella review. Syst Rev 2016; 5.
  29. Njoku ET. Empirical research. In: Encyclopedia of Psychology and Religion. Berlin, Heidelberg: Springer, 2017. Available: here
  30. Frable PJ, Dart L, Bradley PJ, et al. Healthy weigh (El Camino Saludable) phase 1: a retrospective critical examination of program evaluation. Prev Chronic Dis 2006; 3.
  31. Marinescu LG, Sharify D, Krieger J, et al. Be active together: supporting physical activity in public housing communities through women-only programs. Prog Community Health Partnersh 2013; 7:57–66.
  32. Dixon-Ibarra A, Driver S, VanVolkenburg H, et al. Formative evaluation on a physical activity health promotion program for the group home setting. Eval Program Plann 2017; 60:81–90.
  33. Harper GW, Contreras R, Bangi A, et al. Collaborative process evaluation: enhancing community relevance and cultural appropriateness in HIV prevention. J Prev Interv Community 2003; 26:53–69.
  34. Morgan D, Kosteniuk J, O’Connell ME, et al. Barriers and facilitators to development and implementation of a rural primary health care intervention for dementia: a process evaluation. BMC Health Serv Res 2019; 19.
  35. Magnusson M, Hallmyr Lewis M, Smaga-Blom M, et al. Health equilibrium initiative: a public health intervention to narrow the health gap and promote a healthy weight in Swedish children. BMC Public Health 2014; 14.
  36. Kteily-Hawa R, Hari S, Wong JP, et al. Development and implementation of peer leader training for community-based participatory sexual health research. Prog Community Health Partnersh 2019; 13:303–19.
  37. Kelly G, Wang S-Y, Lucas G, et al. Facilitating meaningful engagement on community advisory committees in patient-centered outcome research. Prog Community Health Partnersh 2017; 11:243–51.
  38. Plumb M, Price W, Kavanaugh-Lynch MHE, et al. Funding community-based participatory research: lessons learned. J Interprof Care 2004; 18:428–39.
  39. Gibbons MC, Illangasekare SL, Smith E, et al. A community health initiative: evaluation and early lessons learned. Prog Community Health Partnersh 2016; 10:89–101.
  40. Greer AM, Amlani A, Pauly B, et al. Participant, peer and PEEP: considerations and strategies for involving people who have used illicit substances as assistants and advisors in research. BMC Public Health 2018; 18.
  41. Keller AO, Berman R, Scotty B, et al. Exploring corporate stakeholders’ perspectives on building capacity for employee engagement in workplace wellness initiatives. J Patient Exp 2022; 9.
  42. Geelen SJG, Giele BM, Nollet F, et al. Improving physical activity in adults admitted to a hospital with interventions developed and implemented through cocreation: protocol for a pre-post embedded mixed methods study. JMIR Res Protoc 2020; 9.
  43. Fusari G, Gibbs E, Hoskin L, et al. Protocol for a feasibility study of OnTrack: a digital system for upper limb rehabilitation after stroke. BMJ Open 2020; 10.
  44. Brooks H, Lovell K, Bee P, et al. Implementing an intervention designed to enhance service user involvement in mental health care planning: a qualitative process evaluation. Soc Psychiatry Psychiatr Epidemiol 2019; 54:221–33.
  45. Cameron J, Pidd K, Roche A, et al. A co-produced cultural approach to workplace alcohol interventions: barriers and facilitators. Drugs Educ Prev Policy 2019; 26:401–11.
  46. Gensby U, Braathen TN, Jensen C, et al. Designing a process evaluation to examine mechanisms of change in return to work outcomes following participation in occupational rehabilitation: a theory-driven and interactive research approach. Int J Disabil Manag 2018; 13.
  47. Igel U, Gausche R, Lück M, et al. Challenges in doing multi-disciplinary health promotion research in Germany. Health Promot Int 2018; 33:1082–9.
  48. Hinckson E, Schneider M, Winter SJ, et al. Citizen science applied to building healthier community environments: advancing the field through shared construct and measurement development. Int J Behav Nutr Phys Act 2017; 14.
  49. Reeve J, Cooper L, Harrington S, et al. Developing, delivering and evaluating primary mental health care: the co-production of a new complex intervention. BMC Health Serv Res 2016; 16.
  50. Sormunen M, Saaranen T, Tossavainen K, et al. Process evaluation of an elementary school health learning intervention in Finland. Health Educ 2012; 112:272–91.
  51. Yeary KHK, Cornell CE, Prewitt E, et al. The WORD (wholeness, oneness, righteousness, deliverance): design of a randomized controlled trial testing the effectiveness of an evidence-based weight loss and maintenance intervention translated for a faith-based, rural, African American population using a community-based participatory approach. Contemp Clin Trials 2015; 40:63–73.
  52. Robertson S, Woodall J, Henry H, et al. Evaluating a community-led project for improving fathers’ and children’s wellbeing in England. Health Promot Int 2018; 33:410–21.
  53. Anselma M, Altenburg T, Chinapaw M, et al. Kids in action: the protocol of a youth participatory action research project to promote physical activity and dietary behaviour. BMJ Open 2019; 9.
  54. McMaughan DJ, Ozmetin JP, Welch ML, et al. Framing the front door: co-creating a home health care assessment of service need for children with disabilities. Home Health Care Serv Q 2021; 40:231–46.
  55. Cedstrand E, Nyberg A, Bodin T, et al. Study protocol of a co-created primary organizational-level intervention with the aim to improve organizational and social working conditions and decrease stress within the construction industry – a controlled trial. BMC Public Health 2020; 20.
  56. Robertson S, Carroll P, Donohoe A, et al. ‘The environment was like they were in the pub but with no alcohol’ - a process evaluation of engagement and sustainability in Men on the Move, an Irish community-based physical activity intervention. Int J Mens Soc Community Health 2018; 1:e1–14.
  57. Hassenforder E, Ducrot R, Ferrand N, et al. Four challenges in selecting and implementing methods to monitor and evaluate participatory processes: example from the Rwenzori region, Uganda. J Environ Manage 2016; 180:504–16.
  58. Tolma EL, Cheney MK, Troup P, et al. Designing the process evaluation for the collaborative planning of a local turning point partnership. Health Promot Pract 2009; 10:537–48.
  59. Dyer J, Stringer LC, Dougill AJ, et al. Assessing participatory practices in community-based natural resource management: experiences in community engagement from southern Africa. J Environ Manage 2014; 137:137–45.
  60. Chrisman NJ, Senturia K, Tang G, et al. Qualitative process evaluation of urban community work: a preliminary view. Health Educ Behav 2002; 29:232–48.
  61. Parker EA, Chung LK, Israel BA, et al. Community organizing network for environmental health: using a community health development approach to increase community capacity around reduction of environmental triggers. 2003.
  62. Steckler A, Linnan L, eds. Process Evaluation for Public Health Interventions and Research. US: Jossey-Bass/Wiley, 2002.
  63. Schelvis RMC, Wiezer NM, Blatter BM, et al. Evaluating the implementation process of a participatory organizational level occupational health intervention in schools. BMC Public Health 2016; 16.
  64. Anselma M, Altenburg TM, Emke H, et al. Co-designing obesity prevention interventions together with children: intervention mapping meets youth-led participatory action research. Int J Behav Nutr Phys Act 2019; 16.
  65. Palmer VJ, Chondros P, Piper D, et al. The CORE study protocol: a stepped wedge cluster randomised controlled trial to test a co-design technique to optimise psychosocial recovery outcomes for people affected by mental illness in the community mental health setting. BMJ Open 2015; 5.
  66. Glasgow RE, Vogt TM, Boles SM, et al. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health 1999; 89:1322–7.
  67. Dean M, O’Kane C, Issartel J, et al. Cook like a boss: an effective co-created multidisciplinary approach to improving children’s cooking competence. Appetite 2022; 168.
  68. Wilcox S, Laken M, Parrott AW, et al. The faith, activity, and nutrition (FAN) program: design of a participatory research intervention to increase physical activity and improve dietary habits in African American churches. Contemp Clin Trials 2010; 31:323–35.
  69. Nielsen K, Randall R. Opening the black box: presenting a model for evaluating organizational-level interventions. Eur J Work Organ Psychol 2013; 22:601–17.
  70. Lelie L, van der Molen HF, van den Berge M, et al. The process evaluation of a citizen science approach to design and implement workplace health promotion programs. BMC Public Health 2022; 22.
  71. Greer AM, Luchenski SA, Amlani AA, et al. Peer engagement in harm reduction strategies and services: a critical case study and evaluation framework from British Columbia. BMC Public Health 2016; 16.
  72. Damschroder LJ, Reardon CM, Widerquist MAO, et al. The updated consolidated framework for implementation research based on user feedback. Implement Sci 2022; 17.
  73. Gupta N, Wåhlin-Jacobsen CD, Henriksen LN, et al. A participatory physical and psychosocial intervention for balancing the demands and resources among industrial workers (PIPPI): study protocol of a cluster-randomized controlled trial. BMC Public Health 2015; 15.
  74. Nielsen K, Abildgaard JS. Organizational interventions: a research-based framework for the evaluation of both process and effects. Work Stress 2013; 27:278–97.
  75. Rowe G, Frewer LJ. Public participation methods: a framework for evaluation. Sci Technol Hum Values 2000; 25:3–29.
  76. Beckerman-Hsu JP, Aftosmes-Tobio A, Gavarkovs A, et al. Communities for healthy living (CHL) a community-based intervention to prevent obesity in low-income preschool children: process evaluation protocol. Trials 2020; 21.
  77. Grant A, Treweek S, Dreischulte T, et al. Process evaluations for cluster-randomised trials of complex interventions: a proposed framework for design and reporting. Trials 2013; 14.
  78. Kelly K, Van Vlaenderen H. Evaluating participation processes in community development. Eval Program Plann 1995; 18:371–83.
  79. Hewitt G, Draper AK, Ismail S, et al. Using participatory approaches with older people in a residential home in Guyana: challenges and tensions. J Cross Cult Gerontol 2013; 28:1–25.
  80. De Rosis S, Pennucci F, Noto G, et al. Healthy living and co-production: evaluation of processes and outcomes of a health promotion initiative co-produced with adolescents. Int J Environ Res Public Health 2020; 17:8007.
  81. Poland BD, Tupker E, Breland K, et al. Involving street youth in peer harm reduction education. The challenges of evaluation. Can J Public Health 2002; 93:344–8.
  82. van der Ham AJ, van Erp N, Broerse JEW, et al. Monitoring and evaluation of patient involvement in clinical practice guideline development: lessons from the multidisciplinary guideline for employment and severe mental illness, the Netherlands. Health Expect 2016; 19:471–82.
  83. Reininger BM, Barroso CS, Mitchell-Bennett L, et al. Process evaluation and participatory methods in an obesity-prevention media campaign for Mexican Americans. Health Promot Pract 2010; 11:347–57.
  84. Verloigne M, Altenburg TM, Chinapaw MJM, et al. Using a co-creational approach to develop, implement and evaluate an intervention to promote physical activity in adolescent girls from vocational and technical schools: a case control study. Int J Environ Res Public Health 2017; 14.
  85. Kobeissi L, Nakkash R, Ghantous Z, et al. Evaluating a community based participatory approach to research with disadvantaged women in the southern suburbs of Beirut. J Community Health 2011; 36:741–7.
  86. Clark J, Laing K. Co-production with young people to tackle alcohol misuse. Drugs Alcohol Today 2018; 18:17–27.
  87. den Broeder L, Chung KY, Geelen L, et al. We are all experts! Does stakeholder engagement in health impact scoping lead to consensus? A Dutch case study. Impact Assess Proj Apprais 2016; 34:294–305.
  88. Cramer ME, Lazoritz S, Shaffer K, et al. Community advisory board members’ perspectives regarding opportunities and challenges of research collaboration. West J Nurs Res 2018; 40:1032–48.
  89. Basu Roy U, Michel T, Carpenter A, et al. Community-led cancer action councils in Queens, New York: process evaluation of an innovative partnership with the Queens library system. Prev Chronic Dis 2014; 11:130176.
  90. Dennehy R, Cronin M, Arensman E, et al. Involving young people in cyberbullying research: the implementation and evaluation of a rights-based approach. Health Expect 2019; 22:54–64.
  91. Lundy L. ‘Voice’ is not enough: conceptualising Article 12 of the United Nations Convention on the Rights of the Child. Br Educ Res J 2007; 33:927–42.
  92. Schelvis RMC, Oude Hengel KM, Wiezer NM, et al. Design of the bottom-up innovation project - a participatory, primary preventive, organizational level intervention on work-related stress and well-being for workers in Dutch vocational education. BMC Public Health 2013; 13.
  93. Andrews JO, Cox MJ, Newman SD, et al. Training partnership dyads for community-based participatory research: strategies and lessons learned from the community engaged scholars program. Health Promot Pract 2013; 14:524–33.
  94. Hetherington E, Eggers M, Wamoyi J, et al. Participatory science and innovation for improved sanitation and hygiene: process and outcome evaluation of project SHINE, a school-based intervention in rural Tanzania. BMC Public Health 2017; 17.
  95. Sylvain C, Durand MJ, Velasquez Sanchez A, et al. Development and implementation of a mental health work rehabilitation program: results of a developmental evaluation. J Occup Rehabil 2019; 29:303–14.
  96. Brussoni M, Olsen LL, Joshi P, et al. Aboriginal community-centered injury surveillance: a community-based participatory process evaluation. Prev Sci 2012; 13:107–17.
  97. Falletta KA, Srinivasulu S, Almonte Y, et al. Building community capacity for qualitative research to improve pregnancy intention screening. Prog Community Health Partnersh 2019; 13:411–26.
  98. Heggdal K, Mendelsohn JB, Stepanian N, et al. Health-care professionals’ assessment of a person-centred intervention to empower self-management and health across chronic illness: qualitative findings from a process evaluation study. Health Expect 2021; 24:1367–77.
  99. Svartengren M, Hellman T. Study protocol of an effect and process evaluation of the Stamina model; a structured and time-effective approach through methods for an inclusive and active working life. BMC Public Health 2018; 18.
  100. Sharpe PA, Flint S, Burroughs-Girardi EL, et al. Building capacity in disadvantaged communities: development of the community advocacy and leadership program. Prog Community Health Partnersh 2015; 9:113–27.
  101. de Jong MAJG, Wagemakers A, Koelen MA, et al. Study protocol: evaluation of a community health promotion program in a socioeconomically deprived city district in the Netherlands using mixed methods and guided by action research. BMC Public Health 2019; 19.
  102. Berge JM, Jin SW, Hanson C, et al. Play it forward! A community-based participatory research approach to childhood obesity prevention. Fam Syst Health 2016; 34:15–30.
  103. Bauermeister JA, Pingel ES, Sirdenis TK, et al. Ensuring community participation during program planning: lessons learned during the development of an HIV/STI program for young sexual and gender minorities. Am J Community Psychol 2017; 60:215–28.
  104. Elinder LS, Heinemans N, Hagberg J, et al. A participatory and capacity-building approach to healthy eating and physical activity - SCIP-school: a 2-year controlled trial. Int J Behav Nutr Phys Act 2012; 9:145.
  105. Lazo-Porras M, Liu H, Ouyang M, et al. Process evaluation of complex interventions in non-communicable and neglected tropical diseases in low- and middle-income countries: a scoping review. BMJ Open 2022; 12.
  106. Liu H, Mohammed A, Shanthosh J, et al. Process evaluations of primary care interventions addressing chronic disease: a systematic review. BMJ Open 2019; 9.
  107. Muvuka B, Combs RM, Ali NM, et al. Depression is real: developing a health communication campaign in an urban African American community. Prog Community Health Partnersh 2020; 14:161–72.
  108. Shahmanesh M, Okesola N, Chimbindi N, et al. Thetha Nami: participatory development of a peer-navigator intervention to deliver biosocial HIV prevention for adolescents and youth in rural South Africa. BMC Public Health 2021; 21.
  109. Stetler CB, Legro MW, Wallace CM, et al. The role of formative evaluation in implementation research and the QUERI experience. J Gen Intern Med 2006; 21 Suppl 2:S1–8.

  • Received: 8 November 2023
  • Accepted: 13 May 2024
  • First published: 4 July 2024