Original Article

The Oxford Implementation Index: a new tool for incorporating implementation data into systematic reviews and meta-analyses
Introduction
Evidence-based practice encourages clinicians to look to systematic reviews of specific interventions, such as those produced and maintained by the Cochrane Collaboration, as the “gold standard” for measuring effects. Rigorous systematic reviews and meta-analyses are designed to minimize bias, efficiently distil large amounts of information, and provide information of value to clinicians [1]. However, systematic reviews and meta-analyses of medical or psychosocial interventions present methodological challenges [2]. Reviewers must exercise subjective judgment when deciding whether a statistical meta-analysis will be a reliable summary and when describing how results may be applied to clinical practice. Reviewers are trained to appraise many sources of variation across trials, particularly characteristics related to the methodological quality of included trials. Systematic review guidance, such as the Cochrane Handbook and the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) statement, provides thorough instructions for the identification, extraction, and appraisal of information on methodological quality. These guidance documents, however, place less emphasis on intervention implementation, which can be an important source of variation across primary trials. Implementation encompasses information related to intervention design, delivery, and uptake in primary trials. To date, systematic review guidance has not provided clear direction for reviewers seeking to compare implementation across trials.
Distinguishing between fidelity and implementation
Individual trials use various terms for implementation fidelity, defined as the degree to which interventions are implemented as designed. These terms include adherence, treatment fidelity, treatment integrity, program integrity, and implementation quality. As used in primary trials, these definitions of fidelity are intended to capture practitioners' and participants' compliance with intervention protocols. The rigorous assessment of implementation fidelity can provide many
Methods
The Oxford Implementation Index was developed by a team of systematic reviewers at the Centre for Evidence-Based Intervention at Oxford University. Team members are affiliated with the Cochrane and Campbell Collaborations and independently conduct randomized controlled trials in areas of public health, psychiatry, and social welfare. A steering committee was established, and the index development began in 2005.
Domains of the Oxford Implementation Index
Each implementation domain encompasses a number of implementation characteristics, which may vary across trials assessing similar interventions. The index highlights many of these characteristics, but reviewers should decide which aspects are most relevant to their topic areas. Broadly, the final domains are intervention design, the actual delivery by trial clinicians, the uptake of the intervention by participants, and contextual factors.
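The four final domains can be thought of as a per-trial extraction record. The following is a minimal, hypothetical sketch of such a record; the class, field, and function names (`ImplementationRecord`, `domains_reported`, and the example keys and trial identifier) are illustrative assumptions, not part of the published index.

```python
from dataclasses import dataclass, field

# Hypothetical sketch (not part of the published index): one way a review
# team might record per-trial implementation data under the four domains.
@dataclass
class ImplementationRecord:
    trial_id: str
    design: dict = field(default_factory=dict)    # intended intervention design
    delivery: dict = field(default_factory=dict)  # actual delivery by trial clinicians
    uptake: dict = field(default_factory=dict)    # uptake by participants
    context: dict = field(default_factory=dict)   # contextual factors

def domains_reported(record: ImplementationRecord) -> list:
    """Return the names of domains for which this trial reported any data."""
    return [name for name in ("design", "delivery", "uptake", "context")
            if getattr(record, name)]

# Example: a trial reporting design and delivery details but nothing on
# participant uptake or context (identifier is invented for illustration).
trial = ImplementationRecord(
    trial_id="trial-001",
    design={"sessions_planned": 12},
    delivery={"sessions_delivered_mean": 9.5},
)
print(domains_reported(trial))  # → ['design', 'delivery']
```

A structure like this makes gaps visible at a glance: comparing `domains_reported` across included trials shows which implementation domains are consistently reported and which are missing, which is exactly the cross-trial comparison the index is meant to support.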
Discussion
The Oxford Implementation Index fills an important gap in current guidelines for conducting systematic reviews and meta-analyses. It provides an explicit and systematic method of assessing implementation data across trials, which can help reviewers to combine trials appropriately, explain heterogeneity within reviews, and critically appraise the generalizability of reviews to clinical practice. By encouraging reviewers to focus more explicitly and carefully on implementation data, we hope that
Acknowledgments
The authors thank the late Leonard Gibbs, Janet Harris, and graduate students in the Oxford Evidence-Based Intervention course for helping to refine the index in its early stages. They are grateful to Isabelle Boutron, Jane Dennis, Janet Harris, Alan Kazdin, Steve Milan, Francheska Perepletchikova, Prathap Tharyan, and Denise Thomson for their comments on this article. They also would particularly like to thank Sean Grant for his help in the latter stages of preparing this article.
References (92)
- et al. Program integrity in primary and early secondary prevention: are implementation effects out of control? Clin Psychol Rev (1998)
- et al. Fair tests of clinical trials: a treatment implementation model. Adv Behav Res Ther (1994)
- et al. Treatment fidelity in outcome studies. Clin Psychol Rev (1991)
- et al. The CONSORT statement: revised recommendations for improving the quality of reports of parallel-group randomised trials. Lancet (2001)
- et al. CONSORT 2010 statement: updated guidelines for reporting parallel group randomised trials. J Clin Epidemiol (2010)
- et al. Improving the quality of reports of meta-analyses of randomised controlled trials: the QUOROM statement. Lancet (1999)
- et al. GRADE guidelines: 8. Rating the quality of evidence—indirectness. J Clin Epidemiol (2011)
- et al. Assessing medication adherence by pill count and electronic monitoring in the African American Study of Kidney Disease and Hypertension (AASK) pilot study. Am J Hypertens (1996)
- et al. Summing up evidence: one answer is not always enough. Lancet (1998)
- et al. The impact of the CONSORT statement on reporting of randomized clinical trials in psychiatry. Contemp Clin Trials (2009)
- Improving the quality of reporting alcohol outcome studies: effects of the CONSORT statement. Addict Behav
- Trying to do more good than harm in policy and practice: the role of rigorous, transparent, up-to-date, replicable evaluations. Ann Am Acad Polit Soc Sci
- The challenges of systematically reviewing public health interventions. J Public Health
- Definitional and practical issues in the assessment of treatment integrity. Clin Psychol Sci Pract
- A review of research on fidelity of implementation: implications for drug abuse in school settings. Health Educ Res
- Treatment integrity: implications for training. Clin Psychol Sci Pract
- Evaluating fidelity: predictive validity for a measure of competent adherence to the Oregon model of parent management training. Behav Ther
- Therapists, therapist variables, and cognitive-behavioral therapy outcome in a multicenter trial for panic disorder. J Consult Clin Psychol
- Making evidence based interventions work
- Treatment integrity and therapeutic change: issues and research recommendations. Clin Psychol Sci Pract
- Multisystemic therapy: monitoring treatment fidelity. Fam Process
- Testing the integrity of a psychotherapy protocol: assessment of adherence and competence. J Consult Clin Psychol
- A conceptual framework for implementation fidelity. Implement Sci
- Implementation research: a synthesis of the literature
- Enhancing treatment fidelity in health behavior change studies: best practices and recommendations from the NIH Behavior Change Consortium. Health Psychol
- A new tool to assess treatment fidelity and evaluation of treatment fidelity across 10 years of health behavior research. J Consult Clin Psychol
- Examples of implementation and evaluation of treatment fidelity in the BCC studies: where we are and where we need to go. Ann Behav Med
- Treatment integrity in applied behavior analysis with children. J Appl Behav Anal
- Treatment integrity of school-based behavioral intervention studies: 1980-1990. Sch Psychol Rev
- The integrity of independent variables in behavior analysis. J Appl Behav Anal
- CONSORT 2010 explanation and elaboration: updated guidelines for reporting parallel group randomised trials. J Clin Epidemiol
- Reporting of noninferiority and equivalence randomized trials: an extension of the CONSORT statement. JAMA
- CONSORT statement: extension to cluster randomised trials. BMJ
- Better reporting of harms in randomized trials: an extension of the CONSORT statement. Ann Intern Med
- Evidence-based behavioral medicine: what is it and how do we achieve it? Ann Behav Med
- Extending the CONSORT statement to randomized trials of nonpharmacologic treatment: explanation and elaboration. Ann Intern Med
- CONSORT 2010 statement: extension to cluster randomised trials. BMJ
- Improving the reporting of pragmatic trials: an extension of the CONSORT statement. BMJ
- Reporting randomized, controlled trials of herbal interventions: an elaborated CONSORT statement. Ann Intern Med
- CONSORT for reporting randomized controlled trials in journal and conference abstracts: explanation and elaboration. PLoS Med
- The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. J Clin Epidemiol
Conflict of interest: The Oxford Implementation Index was developed by a team of systematic reviewers at the Centre for Evidence-Based Intervention at Oxford University. The team members are affiliated with the Cochrane and Campbell Collaborations and independently conduct randomized controlled trials in areas of public health, psychiatry, and social welfare. P.M. is the author of over a dozen Cochrane Reviews and was, until recently, cochair of the Campbell Collaboration Social Welfare Group. He led this team and managed the project. P.M., E.M.-W., D.O., F.G., and K.U. contributed to the conception of the index and initial project planning. A systematic literature search was developed and conducted by K.U. and P.M. The index was developed by all the authors, led by P.M. and K.U. Piloting was led by P.M. and facilitated by F.G., E.M.-W., D.O., and K.U. K.U., E.M.-W., and P.M. initially evaluated the index using their own reviews. The Delphi panel was run by P.M. and K.U.; all the authors discussed feedback and, led by P.M., refined the index. All the authors contributed to the writing and revision of the article, led by K.U., E.M.-W., and P.M. The index was the focus of a doctoral dissertation by K.U.
Funding: We acknowledge the support of the Oxford Department of Social Policy and Intervention and the Centre for Evidence-Based Intervention. This research was supported in part by infrastructure and resources provided by the Brown University Alcohol Research Center on HIV/AIDS (NIH/NIAAA P01AA019072) and the Lifespan/Tufts/Brown Center for AIDS Research (NIH/NIAID P30AI042853).
Ethical approval: This article was based entirely on secondary research. Ethical approval was not required.
Competing interests: All authors have completed the Unified Competing Interest form at www.icmje.org/coi_disclosure.pdf and declared no support from any organization for the submitted work; no financial relationships with any organizations that might have an interest in the submitted work in the previous 3 years; no other relationships or activities that could appear to have influenced the submitted work.