Time to publication for NIHR HTA programme-funded research: a cohort study
  1. Fay Chinnery1,
  2. Amanda Young1,
  3. Jennie Goodman1,
  4. Martin Ashton-Key1,
  5. Ruairidh Milne2
  1. National Institute for Health Research Evaluation, Trials and Studies Coordinating Centre (NETSCC), University of Southampton, Southampton, Hampshire, UK
  2. Wessex Institute, University of Southampton, Southampton, Hampshire, UK
  Correspondence to Dr Fay Chinnery; F.Chinnery@soton.ac.uk

Abstract

Objective To assess the time to publication of primary research and evidence syntheses funded by the National Institute for Health Research (NIHR) Health Technology Assessment (HTA) Programme published as a monograph in Health Technology Assessment and as a journal article in the wider biomedical literature.

Study design Retrospective cohort study.

Setting Primary research and evidence synthesis projects funded by the HTA Programme were included in the cohort if they were registered in the NIHR research programmes database and were planned to submit the draft final report for publication in Health Technology Assessment on or before 9 December 2011.

Main outcome measures The median time to publication and publication at 30 months in Health Technology Assessment and in an external journal were determined by searching the NIHR research programmes database and HTA Programme website.

Results Of 458 included projects, 184 (40.2%) were primary research projects and 274 (59.8%) were evidence syntheses. A total of 155 primary research projects had a completion date; the median time to publication was 23 months (26.5 and 35.5 months to publish a monograph and to publish in an external journal, respectively) and 69% were published within 30 months. The median time to publication of HTA-funded trials (n=126) was 24 months and 67.5% were published within 30 months. Among the evidence syntheses with a protocol online date (n=223), the median time to publication was 25.5 months (28 months to publication as a monograph) and 65% were published within 30 months, but only 44.4% of evidence synthesis projects were published in an external journal.

Conclusions Research funded by the HTA Programme is published promptly. The importance of Health Technology Assessment was highlighted by the median time to publication being 9 months shorter for a monograph than for an external journal article.

  • STATISTICS & RESEARCH METHODS

This is an Open Access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 3.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/3.0/

Strengths and limitations of this study

  • The study involves a large cohort, representing almost 20 years of research funded on behalf of the National Health Service.

  • This report complements previous work which has shown that 98% of Health Technology Assessment projects funded since 2002 will publish a monograph.

  • This project relied heavily on the National Institute for Health Research programmes database and some data were not available for the analyses.

Introduction

In order for research to help patients and aid clinicians in their decision-making it must be published in full and made available in a timely fashion. However, it is estimated that over 50% of studies are never published in full, and studies with disappointing (non-significant) results may not be published at all.1,2 Non-publication is believed to be primarily due to failure to write up and submit research, rather than manuscripts being rejected.3 Studies with null or negative findings take longer to be published than those with positive results,4,5 and this publication bias may invalidate a meta-analysis, leading to overestimation of treatment effects. As a result, new interventions may be adopted without suitable evidence to support them.

During 2011/2012, the NIHR invested £202.2 million in research across a broad range of programmes and initiatives. Health Technology Assessment (also known as the monograph series) is the peer-reviewed journal of the NIHR HTA Programme. Reports published in Health Technology Assessment provide a full account of the research project, including the methods and a full description of the results. These full monographs complement the shorter articles that the NIHR actively encourages researchers to submit to other peer-reviewed journals as part of their dissemination strategy.

In addition to publication bias, selective outcome reporting may also lead to overestimation of the effectiveness of the treatment, emphasising the need for rigorous reporting of research. Trials funded by the NIHR HTA Programme that only publish in Health Technology Assessment tend to have a higher p value for the main outcome compared with those that also have a publication in another journal. The full Health Technology Assessment monograph generally contains more outcomes than the main trial publication, and journal articles tend to report a higher proportion of statistically significant outcomes. Consequently, researchers including HTA-funded trials in their systematic reviews are recommended to use information from the monograph and not from the associated journal article.6

Turner et al7 have shown that 98% of projects funded by the HTA Programme in the past 10 years were published in the monograph series. In contrast, Ross et al8 found that only 68% of clinical trials funded by the US National Institutes of Health (NIH) were published, with 46% being published within 30 months of trial completion. Tricco et al9 established that Cochrane reviews have a median time to publication of 2.4 years (∼29 months), but only 80.9% of Cochrane protocols are published overall. Given the importance of publishing promptly and the recommendation that researchers use data from the monograph of a project rather than from its journal article, the aim of this study was to determine the time to publication for HTA-funded primary research and evidence synthesis projects in Health Technology Assessment and in the wider biomedical literature, and to compare time to publication with other organisations that fund or evaluate research.

Methods

Cohort sample

The cohort in this project is derived from the NIHR research programmes database. It is a subsample of the dataset used by Turner et al7 and includes projects that planned to submit their draft final report on or before 9 December 2011 (as recorded in the NIHR research programmes database). Based on project classification in the database, the cohort was divided into two main categories: primary research and evidence synthesis; primary research was subdivided further into trials (as defined by Ross et al8) and the remainder were categorised as ‘others’.

Data extracted from the database included the project reference number, its publication date in Health Technology Assessment and the date when the evidence synthesis protocols were made available online. The Health Technology Assessment monograph (or the draft final report or external publication if the project did not have a published monograph) was manually searched for the end of recruitment date and the length of follow-up in order to calculate the study conclusion date for the primary research projects. We also manually searched the Health Technology Assessment journal website to identify, for all projects, the online publication date of the first report in an external journal. We took a pragmatic approach and excluded protocols, background papers and systematic reviews that may have been conducted before the research began. We included the first report that used clinical data from the project, and excluded cost-effectiveness analyses (unless the project report specifically stated that it was an economic evaluation).

Time to publication

For primary research, the time to publication was determined by calculating the number of months from when the study concluded (ie, end of follow-up, using the same methodology as Ross et al8) to when the monograph was first published online and to when the first external publication was available online. For evidence syntheses, we followed the protocol of Tricco et al.9 Time to publication was measured as the number of months from when the protocol was first made available online to the online publication date of the monograph and to the online availability of the study in an external journal.
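
To make this measurement concrete, the short sketch below shows how a time to publication in whole calendar months can be derived from two dates. It is an illustration only: the dates, variable names and rounding rule are assumptions for the example, not values from the NIHR research programmes database.

```python
from datetime import date

def months_between(start: date, end: date) -> int:
    """Whole calendar months from start to end, rounding partial months down."""
    months = (end.year - start.year) * 12 + (end.month - start.month)
    if end.day < start.day:  # the final month is not yet complete
        months -= 1
    return months

# Hypothetical primary research project: follow-up ended 1 March 2008 and the
# monograph appeared online on 15 May 2010 -> 26 months to publication.
study_conclusion = date(2008, 3, 1)
monograph_online = date(2010, 5, 15)
print(months_between(study_conclusion, monograph_online))
```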

Three researchers (FC, MA-K and JG) conducted data extraction for the primary research dataset and any disagreement was resolved in discussion. Two researchers (FC and JG) extracted the data for the evidence synthesis projects. Again, any disagreement was settled in discussion. In the case of primary research, the first output registered was often the protocol or a background paper; consequently, two researchers (AY and FC) manually searched the HTA journal website to determine the publication date of the first report from a project and this date was confirmed in discussion.

Data analysis

Kaplan-Meier survival curves were produced for primary research and evidence synthesis projects; the percentage of HTA-funded studies published in the monograph series was compared with that published in other peer-reviewed journals. We calculated the median time to publication (the time for 50% of funded studies to publish) in Health Technology Assessment, elsewhere and for the first output, for primary research, trials and evidence syntheses.

Ross et al8 have emphasised the need for timely publication and applied a cut-off of 30 months to trials funded by the NIH. We therefore also calculated the percentage of HTA-funded studies published at 30 months and the total percentage published, both in the monograph series and elsewhere.
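
As a minimal, hedged illustration of these two quantities, the sketch below (Python with NumPy, not the tooling used in the study) computes the median time to publication and the 30-month publication rate from a Kaplan-Meier estimate in which unpublished projects are treated as censored. The durations and publication flags are synthetic placeholders, not the cohort data.

```python
import numpy as np

def kaplan_meier(durations, published):
    """Kaplan-Meier estimate of the probability of remaining unpublished.

    durations -- months from study conclusion (or protocol online date) to
                 publication, or to the censoring date if not yet published
    published -- 1 if the project was published, 0 if censored
    """
    durations = np.asarray(durations, dtype=float)
    published = np.asarray(published, dtype=int)
    curve, s = [], 1.0
    for t in np.unique(durations[published == 1]):
        at_risk = np.sum(durations >= t)
        events = np.sum((durations == t) & (published == 1))
        s *= 1.0 - events / at_risk
        curve.append((t, s))
    return curve

def median_time_to_publication(curve):
    """Smallest time at which at least 50% of funded projects have published."""
    for t, s in curve:
        if s <= 0.5:
            return t
    return None  # fewer than half of the funded projects have published

def percent_published_by(curve, cutoff=30):
    """Estimated percentage of funded projects published within `cutoff` months."""
    s_at_cutoff = 1.0
    for t, s in curve:
        if t <= cutoff:
            s_at_cutoff = s
    return 100.0 * (1.0 - s_at_cutoff)

# Synthetic placeholder data only -- not the NIHR cohort.
months = [18, 22, 23, 24, 27, 29, 33, 36, 40, 48]
flags = [1, 1, 1, 1, 1, 0, 1, 1, 0, 1]
km = kaplan_meier(months, flags)
print(median_time_to_publication(km), percent_published_by(km, 30))
```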

Minitab was used to assess the distribution of the data subsets (Anderson-Darling normality test) and to determine the IQRs. Any statistical difference between median times to publication was tested using the Mann-Whitney U test.
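
The study's analyses were run in Minitab; for readers working in Python, an equivalent pair of checks could be written with SciPy as sketched below. The arrays are placeholders rather than the study data, and this is an illustration of the two tests rather than the authors' workflow.

```python
from scipy import stats

# Placeholder times to publication in months (not the study data).
monograph_months = [19, 21, 23, 26, 27, 30, 31, 35, 40, 44]
external_months = [24, 28, 30, 33, 35, 37, 41, 45, 52, 60]

# Anderson-Darling test of normality: a statistic exceeding the critical value
# at a given significance level suggests the subset is not normally distributed.
ad = stats.anderson(monograph_months, dist='norm')
print(ad.statistic, ad.critical_values, ad.significance_level)

# Mann-Whitney U test comparing the two distributions of publication times,
# used because the data are not assumed to be normally distributed.
u_stat, p_value = stats.mannwhitneyu(monograph_months, external_months,
                                     alternative='two-sided')
print(u_stat, p_value)
```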

Results

We identified 458 projects for inclusion in our analyses (figure 1).

Figure 1

Flow diagram of projects in this study.

Primary research

The primary research subset contained 184 projects; however, 29 of these did not state an end of recruitment date, or the length of follow-up could not be determined. Consequently, it was not possible to calculate the last point of data collection for 15.8% of HTA Programme-funded primary research, even though many of these studies do have a publication.

Of the 155 primary research projects with a completion date, the median time to any publication (the time for 50% of the funded studies to publish) was 23 months (IQR 19.0 months). The median time to publication as a monograph in Health Technology Assessment was 26.5 months (IQR 20.5 months) and in any other external journal 35.5 months (IQR 19.0 months), but this difference was not statistically significant (p=0.149).

Sixty-nine per cent of all primary research funded by the HTA Programme was published within 30 months, but only 56.1% of monographs were produced within this time. Limiting the analysis to trials, which are directly comparable to the work of Ross et al,8 67.5% were published within 30 months, with a median time to publication of 24 months (IQR 15.3 months; table 1). The overall publication rates were 92.9% for any publication, 88.4% in the monograph and 62.6% in an external journal (table 1, figure 2).

Table 1

Publication characteristics of HTA Programme-funded primary research and trials (studies with a completion date)

Figure 2

Cumulative percentage of Health Technology Assessment-funded primary research (studies with a study completion date). Publication rate in the Health Technology Assessment monograph versus other peer-reviewed biomedical journals and time to the first publication anywhere.

Evidence synthesis

Of the 274 evidence syntheses, the database did not record a protocol online date for 51 (18.6%) projects, so these could not be included in further analyses. Of the remaining projects, the median time to any publication was 25.5 months (IQR 16 months) and the median time to publication of a monograph was 28.0 months (IQR 19 months); unlike primary research, however, fewer than 50% of evidence synthesis projects were published in other peer-reviewed journals (table 2 and figure 3), so it was not possible to test for statistical significance. Evidence syntheses were published in a timely fashion, with 65% of studies published within 30 months and 93.3% published overall.

Table 2

Publication characteristics of HTA Programme-funded evidence synthesis (studies with a protocol online date)

Figure 3

Cumulative percentage of Health Technology Assessment-funded evidence syntheses (studies with a protocol online date). Publication rate in the Health Technology Assessment monograph versus other peer-reviewed biomedical journals and time to the first publication anywhere.

Discussion

Using the standard of Ross et al,8 HTA-funded research is published promptly: 69% of primary research projects were published within 30 months, with a median time to publication of 23 months. Sixty-five per cent of evidence synthesis projects were published within 30 months and the median time to publication was 25.5 months.

Strengths and limitations

The main strength of this study is that it involves a large cohort, representing almost 20 years of research funded on behalf of the National Health Service. This report complements previous work which has shown that 98% of HTA projects funded since 2002 will publish a monograph.7 This project used a subsample of the dataset of Turner et al7 with the intention of determining the time to publication of all the primary research and evidence synthesis projects that do publish. However, a major limitation of this project is the amount of data missing from the analyses. It was not possible to determine the end of follow-up for over 15% of primary research projects, and over 18% of the evidence synthesis studies did not have a recorded protocol online date, so they were not included in the analyses. Since data recording was poorer in earlier years (unpublished data), we have disproportionately excluded more of the older projects. Consequently, since older projects generally took longer to publish (unpublished data), we may be underestimating the time HTA-funded studies take to publish overall.

This project relied heavily on data from the NIHR research programmes database and the Health Technology Assessment journal website to determine whether a study had been published elsewhere, which in turn depends on self-declarations from the principal investigators (PIs), as per their contractual obligations. Preliminary work in an internal NETSCC report found that PIs were under-reporting their external publications by 15.8%; the overall external publication rate is therefore likely to be higher than reported here, and we may be overestimating the median time to publication in an external journal. The under-reporting may also affect the 'Any publication' Kaplan-Meier curve and thereby influence the median time to the first publication as well.

Comparison with other studies

Ross et al8 highlighted the need for the publication process to be prioritised in order to shorten the time taken for research findings to be made available to the public. Their work found that the median time to publication of clinical trials funded by the US NIH and registered with ClinicalTrials.gov (and completed by 31 December 2008) was 23 months. However, this is the median only of the trials that were published, not of the whole cohort (ie, all the trials that were funded), and so it underestimates the time to publication. Funders and researchers should aspire to publish all of their research, so the time taken for 50% of all funded studies to publish is the appropriate median time to publication. Arguably, the publication rate at 30 months may be the truly important measure of timeliness of publication.

It takes ∼32 months for half of the clinical trials funded by the US NIH to publish; only 46% were published within 30 months of trial completion, with an overall publication rate of 68%. In comparison, the median time to publication of HTA Programme-funded trials was 24 months, with 67.5% published within 30 months and 93.7% published overall. The Health Technology Assessment figures also compare very favourably with the results from industry-sponsored trials; trials conducted by GlaxoSmithKline in Spain between 2001 and 2006 had a publication rate of 61% and a median time to publication of 28.4 months, although it was not clear whether this was the median of the published trials or of the funded ones.10 The median time to publication of more recent NIH clinical trials (those with a ClinicalTrials.gov identifier, published during 2009 and indexed in MEDLINE) is 21 months,11 but that study did not comment on how long it took for 50% of the funded trials to publish. Finally, 68.0% of NIH-funded studies were published overall, whereas 62.6% of HTA-funded primary research projects were published externally. This highlights the importance of the monograph series, as it provides a means of publication for those projects that would not otherwise reach the public domain.

HTA-funded evidence syntheses are also produced in a timely manner, with a median time to publication of 25.5 months and 65% of studies being published within 30 months (93.3% publishing overall). In comparison, Cochrane reviews have a median time to publication of ∼29 months, with only 80.9% publishing in full after 8 years of follow-up.

Implications

The median time to publication in the monograph series and in an external journal could only be compared for primary research (as over half of the evidence syntheses do not have a recorded external publication); here, a monograph was produced 9 months earlier. The publication rate at 30 months and in total, for both types of research, was considerably higher in the monograph series than in other peer-reviewed biomedical journals. The shorter time to publication and high publication rate in Health Technology Assessment are laudable; ensuring that information from research is easily accessible and widely available is important because it facilitates its use, increases its impact and consequently its value to society. Unpublished data may also invalidate the conclusions of meta-analyses and systematic reviews. These are not just a valuable source of information for healthcare professionals and researchers; definitive conclusions about an intervention also prevent more patients being put at risk in further unneeded trials, or being deprived of the correct treatment. Having Health Technology Assessment is clearly important for disseminating research to the public in a timely fashion and for ensuring that data are not lost as a result of publication bias.

Conclusion and recommendations

Research funded by the HTA Programme is published in a timely fashion; where a comparison was possible, time to publication was 9 months shorter for a monograph than for an external journal article, and the publication rate was considerably higher in Health Technology Assessment than in other peer-reviewed journals, both overall and at 30 months. HTA-funded trials publish more promptly than those funded by the NIH and industry, and HTA-funded evidence syntheses are produced sooner than Cochrane reviews. This study highlights the importance of HTA Programme research being funded via a contract that obliges researchers to publish their findings in full.

We recommend that other funding organisations make it a condition for their investigators to publish final project results in full within a set time, and that they support this practice regardless of whether or not the findings are significant. In the UK, the Health Research Authority (HRA) is responsible for protecting and promoting the interests of patients and the public in health research. It plays a key leadership role in promoting transparency and has made a number of commitments to ensure the publication and dissemination of health research results.12

Future work should investigate the time to publication for other funders and ways in which delays can be reduced without compromising quality. Regardless of the funder, all trials should be registered and the methods and results should be reported in full and in a timely fashion, as called for by the AllTrials initiative.13,14

Acknowledgments

The authors would like to thank Sheila Turner for providing the reference numbers of the projects and their publication date in the monograph series; Stephen Lemon for his advice about data extraction from the NIHR research programmes database and Jo Merritt for sharing her data concerning authors not reporting projects that publish in an external journal. The authors would also like to acknowledge the Metadata team for providing their database and trial details used in the study.

References

Footnotes

  • Contributors The study was designed by RM, MA-K, FC and AY. FC, AY, JG and MA-K performed data extraction, FC and AY conducted the data analyses. FC drafted the manuscript, guided by MA-K, RM and AY. All authors have read and approved the final manuscript.

  • Funding This research was supported by the NIHR Evaluation, Trials and Studies Coordinating Centre through its Research on Research programme.

  • Competing interests FC has worked for the NIHR Evaluation Trials and Studies Coordinating Centre (NETSCC) since February 2012; AY is an employee of NETSCC which hosts the Research on Research Programme, from where this work originated; JG worked for NETSCC from January 2006 to April 2013; MA-K is currently an editor for the Health Technology Assessment journal and a full-time employee of NETSCC; RM is employed as the Head of NETSCC and has worked for NETSCC (and its predecessor organisation) in senior roles on and off since 1996. He was an editor of the Health Technology Assessment journal (1997–2005) and a founder editor for other journals in the new NIHR Journals Library (2011–2012).

  • Provenance and peer review Not commissioned; externally peer reviewed.

  • Data sharing statement No additional data are available.