Table 2. Summary of the methodology used and quality assessment of the studies

Stage of process evaluation: Feasibility/piloting (20 studies)
Methodology and methods: 9 studies used theories or frameworks. 18 used interviews, 3 used focus group discussions, 4 used questionnaires or surveys, and 2 studies used routine monitoring data, field notes, minutes of meetings and observations.
Analysis: Thematic analysis; the constant comparative approach was most commonly used, with some studies using framework analysis.
Quality criteria:
Planning:
Team description: 11 Y, 6 N, 3 NA.
Design and conduct:
Purpose: 20 Y.
Intervention description and causal assumptions clarified:
5 Y, 6 unclear, 9 NA, 0 N.
Justify choice of timing and methods: 19 Y, 1 N.
COREQ covered out of the 3 domains (17 applicable studies):
3 domains: 11.
2 domains: 3.
1 domain: 3.
Reporting:
Clearly labelled as process evaluations: 5.
Protocol/full report: 8.

Stage of process evaluation: Evaluation of effectiveness (43 studies)
Methodology and methods: 12 studies used existing theories and frameworks (6 classic theories, 3 evaluation frameworks, 3 implementation theories).
2000–2004: 3 studies documented specific processes of care as part of the process evaluation, which were reported as part of the main trial; 4 studies investigated the acceptability of an intervention using surveys/questionnaires.
2005 onwards: 12 studies used interviews alone to explore implementation and acceptability; 20 studies used interviews triangulated with other sources of data (eg, chart audit); 2 studies used routine administrative data to indicate fidelity; 3 studies used questionnaires or surveys.
Analysis: Descriptive statistics were used for the quantitative data, and thematic, constant comparison and framework analysis for the qualitative data. The studies that used mixed methods used the quantitative data to indicate the level of implementation, reach and dose, and to triangulate the qualitative findings on implementation and intervention acceptability. The studies that used evaluation frameworks (eg, RE-AIM) and implementation theories (eg, NPT) used them for the analysis and presentation.
Quality criteria:
Planning:
Team description: 21 Y, 21 N, 1 NA.
Design and conduct:
Purpose: 43 Y.
Intervention description and causal assumptions clarified: 25 Y, 8 unclear, 5 NA, 5 N.
Justify choice of timing and methods: 40 Y, 1 N, 2 NA.
Report whether the process data were analysed blind to trial outcomes or post hoc: 29 Y, 7 N, 7 NA.
COREQ covered out of the 3 domains (30 applicable studies):
3 domains: 12.
2 domains: 13.
1 domain: 5.
Reporting:
Clearly labelled as process evaluations: 17 (of note: 2 before 2008, 6 until 2015 and 9 after 2015).
Protocol/full report: 21.

Stage of process evaluation: Post-evaluation (6 studies)
Methodology and methods: 1 study used an existing theory. 2 studies used interviews, 2 used documentary analysis, and 1 used administrative and registry data.
Analysis: Descriptive statistics, subgroup analysis and thematic analysis.
Quality criteria:
Planning:
Team description: 3 Y, 2 N, 1 NA.
Design and conduct:
Purpose: 6 Y.
Intervention description and causal assumptions clarified: 0 Y, 2 unclear, 2 NA, 2 N.
Justify choice of timing and methods: 5 Y, 1 N.
COREQ covered out of the 3 domains (3 applicable studies):
3 domains: 1.
2 domains: 1.
1 domain: 1.
Reporting:
Clearly labelled as process evaluations: 0.
Protocol/full report: 1.
COREQ, Consolidated criteria for Reporting Qualitative research; N, no; NA, not applicable; NPT, normalisation process theory; RE-AIM, reach, efficacy/effectiveness, adoption, implementation and maintenance framework; Y, yes.