
Reproducible research practices, openness and transparency in health economic evaluations: study protocol for a cross-sectional comparative analysis
  1. Ferrán Catalá-López1,2,3,
  2. Lisa Caulley3,4,5,6,
  3. Manuel Ridao7,
  4. Brian Hutton3,8,
  5. Don Husereau9,10,
  6. Michael F Drummond11,
  7. Adolfo Alonso-Arroyo12,13,
  8. Manuel Pardo-Fernández14,
  9. Enrique Bernal-Delgado7,
  10. Ricard Meneu15,
  11. Rafael Tabarés-Seisdedos2,
  12. José Ramón Repullo1,
  13. David Moher3,8
  1. 1 Department of Health Planning and Economics, National School of Public Health, Institute of Health Carlos III, Madrid, Spain
  2. 2 Department of Medicine, University of Valencia/INCLIVA Health Research Institute and CIBERSAM, Valencia, Spain
  3. 3 Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, Ontario, Canada
  4. 4 Otolaryngology-Head and Neck Surgery Department, Ottawa Hospital, Ottawa, Ontario, Canada
  5. 5 Department of Epidemiology, Erasmus University Medical Center, Rotterdam, The Netherlands
  6. 6 Ear, Nose and Throat Department, Guy's Hospital, London, UK
  7. 7 Instituto Aragonés de Ciencias de la Salud (IACS), Red de Investigación en Servicios de Salud en Enfermedades Crónicas (REDISSEC), Zaragoza, Spain
  8. 8 School of Epidemiology and Public Health, University of Ottawa, Ottawa, Ontario, Canada
  9. 9 Institute of Health Economics, Edmonton, Alberta, Canada
  10. 10 Faculty of Medicine, University of Ottawa, Ottawa, Ontario, Canada
  11. 11 Centre for Health Economics, University of York, York, UK
  12. 12 Department of History of Science and Documentation, University of Valencia, Valencia, Spain
  13. 13 Information and Social and Health Research Unit (UISYS), University of Valencia and Spanish National Research Council (CSIC), Valencia, Spain
  14. 14 Spanish Medicines and Healthcare Products Agency (AEMPS), Madrid, Spain
  15. 15 Fundación Instituto de Investigación en Servicios de Salud, Valencia, Spain
  1. Correspondence to Dr Ferrán Catalá-López; ferran_catala{at}outlook.com

Abstract

Introduction There has been a growing awareness of the need for rigorous and transparent reporting of health research to ensure that studies can be reproduced by future researchers. Health economic evaluations, the comparative analysis of alternative interventions in terms of their costs and consequences, have been promoted as an important tool to inform decision-making. The objective of this study will be to investigate the extent to which articles of economic evaluations of healthcare interventions indexed in MEDLINE incorporate research practices that promote transparency, openness and reproducibility.

Methods and analysis This is the study protocol for a cross-sectional comparative analysis. We registered the study protocol within the Open Science Framework (osf.io/gzaxr). We will evaluate a random sample of 600 cost-effectiveness analysis publications, a specific form of health economic evaluation, indexed in MEDLINE during 2012 (n=200), 2019 (n=200) and 2022 (n=200). We will include published papers written in English reporting an incremental cost-effectiveness ratio in terms of costs per life years gained, quality-adjusted life years and/or disability-adjusted life years. Screening and selection of articles will be conducted by at least two researchers. Data on reproducible research practices, openness and transparency will be extracted from each article by multiple researchers using a standardised data extraction form, with a 33% random sample (n=200) extracted in duplicate. Information on general, methodological and reproducibility items will be reported, stratified by year, citation of the Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement and journal. Risk ratios with 95% CIs will be calculated to represent changes in reporting between 2012–2019 and 2019–2022.

Ethics and dissemination Due to the nature of the proposed study, no ethical approval will be required. All data will be deposited in a cross-disciplinary public repository. It is anticipated the study findings could be relevant to a variety of audiences. Study findings will be disseminated at scientific conferences and published in peer-reviewed journals.

  • cost-effectiveness analysis
  • data sharing
  • methodology
  • quality
  • reporting
  • reproducibility

This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/.


Strengths and limitations of this study

  • To our knowledge, this will be the first attempt to examine the extent to which health economic evaluations indexed in MEDLINE incorporate transparency, openness and reproducibility research practices.

  • We will be able to collect data on a broad cross-section of health economic evaluations and will not restrict inclusion based on the medical specialty, disease condition or healthcare intervention.

  • Study findings could be used to strengthen Open Science strategies and recommendations to increase the value of health economic evaluations.

  • The study may be limited by the inclusion of articles only catalogued in one database and written in English.

Introduction

In recent years, there has been a growing awareness of the need for rigorous and transparent reporting of health research to ensure that studies can be reproduced.1–7 The value of health research can be improved by increasing transparency and openness of the processes of research design, conduct, analysis and reporting.8 9 Sharing data and materials from health research studies has multiple positive effects within the research community: it is part of good publication practice in keeping with the principles of Open Science; it allows for the conduct of additional analyses to further explore data and generate new hypotheses; it allows access to unpublished data; and it encourages reproducibility in research.10 Recognising the potential impact of open research culture, journals are increasingly supporting the use of reporting guidelines, as well as policies and technologies that help to improve transparency.11–13 Scientists are increasingly encouraged to use reproducible research practices, which allow others to perform direct replication of studies using the same data and analytic methods.14 15 Furthermore, research funders are changing their grant requirements, including open data sharing.16 17

Health economic evaluations, which compare alternative interventions or programmes in terms of their costs and consequences,18 can help inform resource allocation decisions. A cost-effectiveness analysis, a specific form of economic evaluation that compares alternative options in terms of their costs and their health outcomes, is a valuable tool in health technology assessment processes. Cost-effectiveness analyses have been promoted as an important research methodology for assessing value for money of healthcare interventions and an important source of information for making clinical and policy decisions.19 Decisions about the use of new interventions in healthcare are often based on health economic evaluations. Efforts to increase transparent conduct and reporting of health economic evaluations have existed for many years.20–30 For example, the Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement,30 first published in March 2013, provides recommendations for authors, peer reviewers and journal editors regarding how to prepare reports of health economic evaluations. The aim of CHEERS is to facilitate complete and transparent reporting of health economic evaluations and to support more formal critical appraisal and interpretation. As a potential measure of impact,31 CHEERS has been cited over 1000 times in the Web of Science. However, little attention has been given to reproducibility practices such as sharing of study protocols, data and analytic methods (which allow others to recreate the study findings) as part of health economic evaluation studies.22–25 29

Previous research has evaluated the impact of economic evaluation guidelines and the reporting quality of published articles. For example, Jefferson et al 32 previously investigated whether publication (in August 1996) of the BMJ guidelines on peer review of economics submissions made any difference to editorial and peer review processes, quality of submitted manuscripts and quality of published manuscripts in two high-impact factor medical journals (BMJ and The Lancet). In a sample of 105 articles on economics submissions, 27 (24.3%) were full health economic evaluations. Although Jefferson et al 32 were not studying reproducibility, openness and transparency directly, they did undertake an assessment of the impact of a reporting guideline for health economic evaluations. A 'before and after' assessment of implementation of the guideline was performed to assess how closely the reporting guidelines were followed. The authors found that the publication of the guidelines helped the editors improve the efficiency of the editorial process but had no impact on the reporting quality of health economic evaluations submitted or published.

The primary objective of this study will be to examine the extent to which articles of health economic evaluations of healthcare interventions indexed in MEDLINE incorporate transparency, openness and reproducibility research practices. Secondary objectives will be to explore (1) how the reporting and reproducibility characteristics of health economic evaluations change between 2012 and 2022 and (2) whether transparency and reproducibility practices have improved following the publication of the CHEERS statement in 2013.

Methods and analysis

This is the study protocol for a cross-sectional, comparative analysis. The present protocol has been registered within the Open Science Framework (registration identifier: osf.io/gzaxr). It is anticipated the study will be conducted during January 2020–December 2023.

Eligibility criteria

We will evaluate a random sample of 600 cost-effectiveness and cost-utility analyses of healthcare interventions, indexed in MEDLINE during 2012 (n=200), 2019 (n=200) and 2022 (n=200), which focus on a healthcare intervention in humans and report an incremental cost-effectiveness ratio in terms of costs per life years gained, quality-adjusted life years or disability-adjusted life years. In particular, this analysis will focus on full health economic evaluations that measure health effects in terms of prolongation of life and/or health-related quality of life. We will select this specific form of health economic evaluation because many decision-makers and researchers have recommended this framework as the standard reference for cost-effectiveness in health and medicine.19 Publications of health economic evaluations will be limited to journal articles written in English with an abstract available.

We will exclude editorials, letters, narrative reviews, systematic reviews, meta-analyses, methodological articles, retracted publications and health economic evaluations that do not quantify health impacts in terms of life years gained, quality-adjusted life years or disability-adjusted life years.

Searching

To provide a reliable summary of the literature, we will search MEDLINE through PubMed (National Library of Medicine, Bethesda, Maryland, USA) for candidate studies across three cross-sectional, comparative time periods. First, we will search MEDLINE-indexed articles in 2019 (the ‘reference year’), as it is the year closest to when the protocol for this study was drafted. Second, we will search for articles indexed in 2012 and 2022 to further assess whether transparency and reproducibility practices improved between 2012 (1 year before the publication of the CHEERS statement30 in 2013) and 2022 (10 years after). The literature searches will be conducted by an experienced information specialist. Our main literature search will be peer reviewed by a senior health information specialist using the Peer Review of Electronic Search Strategies checklist.33 The draft literature search strategy is based on a MEDLINE search filter for economic evaluations34 and can be found in online supplementary appendix 1.
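For illustration only, the sketch below shows how a date-restricted query could be run against the PubMed E-utilities interface; the query string is a simplified, assumed stand-in for an economic evaluation filter, not the peer-reviewed search strategy provided in online supplementary appendix 1.

```python
# Illustrative only: the peer-reviewed search strategy is in online
# supplementary appendix 1. The query below is a simplified, assumed stand-in
# for a MEDLINE economic evaluation filter, not the validated filter.
import requests

ESEARCH_URL = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_count(term: str, year: int) -> int:
    """Return the number of PubMed records matching `term` published in `year`."""
    params = {
        "db": "pubmed",
        "term": term,
        "mindate": str(year),
        "maxdate": str(year),
        "datetype": "pdat",  # restrict by publication date
        "retmode": "json",
        "retmax": 0,         # we only need the count, not the record IDs
    }
    response = requests.get(ESEARCH_URL, params=params, timeout=30)
    response.raise_for_status()
    return int(response.json()["esearchresult"]["count"])

if __name__ == "__main__":
    # Assumed, simplified filter combining indexing terms and free text
    term = (
        '("cost-benefit analysis"[MeSH Terms] OR cost-effectiveness[Title/Abstract] '
        'OR "quality-adjusted life years"[MeSH Terms]) AND english[Language] AND hasabstract'
    )
    for year in (2012, 2019, 2022):
        print(year, pubmed_count(term, year))
```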

Screening

All titles and abstracts will be screened using liberal acceleration (where two reviewers need to independently exclude a record while only one reviewer needs to include a record). We will retrieve the full text of any citations meeting our eligibility criteria or for which eligibility remains unclear. A form for screening full-text articles will be pilot tested on 50 articles. Subsequently, at least two reviewers will independently screen all full-text articles. Any discrepancies in screening full-text articles will be resolved via discussion or adjudication by a third reviewer if necessary.
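The liberal-acceleration rule can be made explicit as a simple decision function; the sketch below is illustrative only and assumes a particular encoding of reviewer votes.

```python
# A minimal sketch of the liberal-acceleration rule described above, with an
# assumed encoding of reviewer votes: a record is excluded at the
# title/abstract stage only if two reviewers independently exclude it, while a
# single 'include' vote is enough to advance it to full-text screening.

def advances_to_full_text(votes):
    """votes: list of independent decisions, each 'include' or 'exclude'."""
    if "include" in votes:
        return True                      # one inclusion vote suffices
    return votes.count("exclude") < 2    # exclusion requires two independent votes

assert advances_to_full_text(["exclude", "include"]) is True
assert advances_to_full_text(["exclude", "exclude"]) is False
assert advances_to_full_text(["exclude"]) is True  # still awaiting a second exclusion vote
```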

Data extraction

If more than 600 health economic evaluations are identified in the search, we will perform data extraction on a random sample of articles stratified by publication year (200 each in 2012, 2019 and 2022). If fewer than 200 articles are identified in a given year (eg, 2012), we will randomly select a sufficient number of studies published at the end of the preceding year (eg, October–December 2011) to reach the target sample size. We will not perform any sample size calculations since our study will evaluate multiple indicators that are all considered equally important and that may vary substantially in the proportion of included articles satisfying them. However, 200 articles per year is assumed to be sufficient to capture potential differences.
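A minimal sketch of the stratified sampling step, under the assumption that each eligible record carries a publication year field, is shown below; the fixed seed is an illustrative choice so the draw itself can be reproduced, not a detail specified in this protocol.

```python
# A minimal sketch of the stratified sampling step, assuming each eligible
# record is a dict with (at least) 'pmid' and 'year' keys.
import random

def stratified_sample(records, per_year=200, years=(2012, 2019, 2022), seed=2020):
    rng = random.Random(seed)
    sample = []
    for year in years:
        pool = [r for r in records if r["year"] == year]
        if len(pool) <= per_year:
            # Fewer than 200 eligible articles: take them all; the shortfall is
            # topped up from the end of the preceding year, as described above.
            sample.extend(pool)
        else:
            sample.extend(rng.sample(pool, per_year))
    return sample
```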

Data in each article will be extracted using a standardised data extraction form by multiple researchers, with a 33% random sample (n=200) extracted in duplicate. All data extractors will independently pilot test the form on 30 included studies to ensure consistency in interpretation of data items. Subsequently, data from each study will be independently extracted by one of several reviewers. Any discrepancies in the data extracted will be resolved via discussion or adjudication by a third researcher if necessary. Full articles and supplementary materials with data and analyses will be examined for general and methodological characteristics, statements of publicly available full protocols and data sets, conflicts of interest and funding disclosures. In particular, we will review the final versions of the articles available online.

The selection and wording of general, methodological and reproducibility indicators will be influenced by recommendations from relevant articles on research transparency and reproducibility.4 5 7 8 29 35–41 The standardised data extraction form will include the following items (an illustrative sketch of an extraction record follows the list):

General characteristics

  • Name of journal.

  • Journal impact factor (according to the latest Journal Citation Report at the time of data extraction).

  • Journal type (fully open access journal or subscription-based journal including those that may have open access content, eg, hybrid).

  • Year of publication.

  • Name, gender and country of corresponding author.

  • Type of condition addressed by the economic evaluation (International Statistical Classification of Diseases and Related Health Problems, 10th Revision category).

  • Type of interventions addressed (pharmacological, non-pharmacological, both) and the intervention to which it was compared (the ‘comparator’, eg, active alternative, usual care or placebo/do nothing) with adequate descriptions.40 41

  • Type of economic evaluation (single-study-based economic evaluation or model-based economic evaluation).

  • Study perspective (eg, society, healthcare system/provider) and its relation to the costs being evaluated.

  • Time horizon over which costs and outcomes are being evaluated.

  • Discount rate used for costs and outcomes with rationale (when applicable).

  • Health outcomes used as the measure of benefit (eg, life years gained, quality-adjusted life years or disability-adjusted life years) and their relevance for the type of analysis performed.

  • Measurement of effectiveness (eg, for single-study-based estimates: a description of the design features of the single effectiveness study and why the single study was a sufficient source of clinical effectiveness; and for synthesis-based estimates: a description of the methods used for identification of included studies and synthesis of clinical effectiveness data).

  • Estimate of resources and costs (including a description of the approaches used to estimate resource use associated with the alternative interventions and the methods used to value each resource item in terms of its unit costs).

  • Discussion of all analytical methods supporting the evaluation (eg, methods for dealing with skewed, missing or censored data; extrapolation methods; methods for pooling data; methods for handling population heterogeneity and uncertainty such as subgroup analysis); choice of model and model calibration and validation (when applicable).

  • Results, including the number of incremental cost-effectiveness ratios (ICERs) reported, sensitivity analyses, and subgroup or heterogeneity analyses (eg, variations between subgroups of patients with different baseline characteristics or other variability in effects); the direction of the base case analysis ICER (a qualitative representation of the index ICER, for example, ‘more costs, more outcomes’, ‘less costs, more outcomes’, ‘less costs, comparable outcomes’); the cost-effectiveness ratio values (a quantitative representation of the base case analysis ICER); and the incremental costs (the ratio’s numerator) and health effects (life years gained, quality-adjusted life years or both; the ratio’s denominator) for the base case analysis.

  • Conclusions, classified as favourable if the intervention is clearly claimed to be the preferred choice (eg, cited as ‘cost-effective’, ‘reduced costs’, ‘produced cost savings’, ‘an affordable option’, ‘value for money’), unfavourable if the final comments are negative (eg, the intervention is ‘unlikely to be cost-effective’, ‘produced higher costs’, ‘is economically unattractive’ or ‘exceeded conventional thresholds of willingness to pay’), and neutral or uncertain when the intervention of interest does not surpass the comparator and/or when some uncertainty is expressed in the conclusions.

  • Funding (eg, no statement, no funding, public, private, other, combination of public/private/other).

  • Conflicts of interests (eg, no statement, statement no conflicts exist, statement conflicts exist).

Enablers for reproducibility, transparency and openness

  • Citation and/or mention of CHEERS statement (eg, no citation/mention, citation/mention without reporting checklist, citation/mention with reporting checklist).

  • Use of CHEERS appropriately (eg, when CHEERS was used as a reporting guideline to ensure a clear report of the study’s design, conduct and findings), inappropriately (eg, when CHEERS was used as a methodological tool to design or conduct health economic evaluations or as an assessment tool of methodological quality of publications reporting cost-effectiveness research) or in an unclear or neutral manner (eg, when use was neither appropriate nor inappropriate).31 42

  • Open access or free availability in PubMed Central based on assignment of a specific identifier (yes, no).

  • Protocol/registration mentioned (eg, no protocol, full protocol publicly available, full protocol publicly available and preregistered).

  • Health economics analysis plan mentioned (eg, no analysis plan, indicated that analysis plan was available on request, full access to analysis plan along with research protocol).39

  • Mention of raw data availability (eg, no data sharing, indicated that raw data were available on request, full access to raw data for reanalysis).

  • Mention of access to analytic methods and algorithms (eg, ‘code’, ‘script’, ‘model’) used to perform analyses (eg, no access, indicated that analytic methods were available on request, full access to analytic methods for reanalysis).

  • Type of data repository used, if appropriate, including use of an open, globally scoped repository (eg, Open Science Framework, Dryad, Mendeley, Zenodo), a journal repository (eg, additional file as web appendix or data paper) or another repository (eg, a repository from a specific institution, project or nation).

  • Data made available to recreate the index ICERs (base case).

  • Data made available to recreate all core ICERs (base case and heterogeneity analysis).

  • Data made available to recreate all ICERs (base case, heterogeneity analysis and uncertainty analysis) according to reporting standards.30 38

  • Results have undergone rigorous independent replication and reproducibility checks (eg, whether the study claimed to be a replication effort in the abstract and introduction)4 5: statement of novel findings (eg, the cost-effectiveness analysis claims that it presents some novel findings), statement of replication (eg, the cost-effectiveness analysis clearly claims that it is a replication effort trying to validate previous knowledge, or it can be inferred that the cost-effectiveness analysis is a replication trying to validate previous knowledge), statement of novel findings and replication (eg, the cost-effectiveness analysis claims to be both novel and to replicate previous findings), no statement on novelty or replication (eg, no statement or an unclear statement about whether the cost-effectiveness analysis presents a novel finding or replication).
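As an illustration of how these items could be captured in a structured, machine-readable form, the sketch below defines a partial extraction record; the field names and category labels are assumptions, and the actual standardised form covers all items listed above.

```python
# Assumed field names and category labels for a partial extraction record;
# the actual standardised form covers all general, methodological and
# reproducibility items listed above.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ExtractionRecord:
    pmid: str
    journal: str
    publication_year: int
    cheers_citation: str             # 'none' | 'citation without checklist' | 'citation with checklist'
    open_access_pmc: bool            # PubMed Central identifier assigned
    protocol_status: str             # 'no protocol' | 'publicly available' | 'publicly available and preregistered'
    raw_data_availability: str       # 'no data sharing' | 'available on request' | 'full access for reanalysis'
    analytic_code_availability: str  # 'no access' | 'available on request' | 'full access for reanalysis'
    index_icer_reproducible: Optional[bool] = None  # data sufficient to recreate the base case ICER
```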

Data analysis

The analysis will be descriptive, with data summarised as frequencies for categorical items or medians and IQRs for continuous items. We will characterise the indicators for the period 2012–2022. The proportion of articles satisfying each general, methodological and reproducibility indicator will be reported, stratified by year, by citation of the CHEERS statement and by journal (eg, according to whether or not it is one of the journals that originally endorsed CHEERS). The draft list of original CHEERS-endorsing journals can be found in online supplementary appendix 2. Fisher’s exact tests and risk ratios with 95% CIs, established a priori, will be calculated to represent changes in reporting between 2012–2019 and 2019–2022. We will explore whether reproducible research practices are associated with citation of the CHEERS statement. We will apply a p value threshold of <0.005 for statistical significance, with p values between 0.005 and 0.05 considered suggestive.5 43 44

All analyses will be performed using Stata V.16 or higher (StataCorp LP).
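Although all analyses will be performed in Stata, the Python sketch below illustrates the planned between-period comparison for a single indicator, combining a risk ratio with a 95% CI (from the standard log risk ratio variance) and a Fisher's exact test; the counts shown are hypothetical.

```python
# Hypothetical counts for a single indicator in two periods; the risk ratio
# uses the standard log risk ratio variance for its 95% CI, alongside a
# Fisher's exact test, mirroring the comparisons described above.
from math import exp, log, sqrt
from scipy.stats import fisher_exact

def compare_periods(a, n1, b, n2):
    """a of n1 articles report the indicator in the later period; b of n2 in the earlier one."""
    rr = (a / n1) / (b / n2)
    se = sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)  # standard error of log(RR)
    ci = (exp(log(rr) - 1.96 * se), exp(log(rr) + 1.96 * se))
    _, p_value = fisher_exact([[a, n1 - a], [b, n2 - b]])
    return rr, ci, p_value

# Hypothetical example: 120/200 articles in 2019 vs 90/200 in 2012 report an indicator
rr, ci, p = compare_periods(120, 200, 90, 200)
print(f"RR={rr:.2f}, 95% CI {ci[0]:.2f} to {ci[1]:.2f}, Fisher p={p:.3f}")
```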

Updates and additional analyses

We plan to conduct a continual surveillance of the health economic literature, keeping evidence as up-to-date as possible. Iterations of the searches and review process will be repeated at regular intervals (eg, 3-year intervals after 2022) to continue to present timely and accurate findings. Reanalysis of the proposed reproducibility and transparency metrics and indicators may offer insight into progressive improvements in design, conduct and analysis of health economic evaluations over time.

Any (new) additional analysis examining potential associations between general characteristics from extracted studies (eg, results including index ICER or funding source) and enablers of reproducibility, transparency and openness (eg, mention of CHEERS statement, open access, protocol registration or mention of raw data) will be prospectively reported in a new specific (substudy) protocol, following standard methods described in this paper.

Patient and public involvement

Neither patients nor the public were involved in setting the research question, nor were they involved in developing plans for the design (or implementation) of this study protocol.

Ethics and dissemination

To the best of our knowledge, this cross-sectional analysis will be the first attempt to investigate the extent to which articles reporting cost-effectiveness analyses of healthcare interventions incorporate transparent, open and reproducible research practices. Without complete and transparent reporting of how a health economic evaluation is designed and conducted, it is difficult for readers and potential knowledge users to assess its conduct and validity. Strengthening the reproducibility, openness and reporting of methods and results can maximise the impact of health economic evaluations by allowing more accurate interpretation and use of their findings. We anticipate the study could be relevant to a variety of audiences including journal editors, peer reviewers, research authors, health technology assessment agencies, guideline developers, research funders, educators and other potential key stakeholders. Moreover, the study findings could further be used in discussions to strengthen Open Science to increase value and reduce waste from incomplete or unusable reports of health economic evaluations.

Any amendments made to this protocol when conducting the analyses will be outlined and reported in the final manuscript. Once completed, findings from this study will be published in peer-reviewed journals. All data underlying the findings reported in the final manuscript will be deposited in a cross-disciplinary public repository, such as the Open Science Framework (https://osf.io/). In addition, when new data have become available, we will update the analysis and present the updated findings at a public repository (and we may also seek publication in a peer-reviewed journal).

References


Footnotes

  • Twitter @donhusereau, @chromosome8, @repunomada, @dmoher

  • Contributors All authors contributed to conceptualising and designing the study. FC-L drafted the manuscript. LC, MR, BH, DH, MFD, AA-A, MP-F, EB-D, RM, RT-S, JRR and DM provided comments for important intellectual content and made revisions. All authors read and approved the final version of the manuscript. FC-L accepts full responsibility for the finished manuscript and controlled the decision to publish.

  • Funding FC-L and RT-S are supported by the Institute of Health Carlos III/CIBERSAM. BH is supported by a New Investigator Award from the Canadian Institutes of Health Research and the Drug Safety and Effectiveness Network. MR and EB-D are supported by the Institute of Health Carlos III/Spanish Health Services Research on Chronic Patients Network (REDISSEC). DM is supported by a University Research Chair, University of Ottawa. The funders were not involved in the design of the protocol or decision to submit the protocol for publication nor will they be involved in any aspect of the study conduct.

  • Disclaimer The views expressed in this manuscript are those of the authors and may not be understood or quoted as being made on behalf of, or reflecting the position of, the funder(s) or any institution.

  • Competing interests None declared.

  • Patient consent for publication Not required.

  • Provenance and peer review Not commissioned; externally peer reviewed.
