
On the time spent preparing grant proposals: an observational study of Australian researchers
  1. Danielle L Herbert (1),
  2. Adrian G Barnett (1),
  3. Philip Clarke (2),
  4. Nicholas Graves (1)

  1. School of Public Health & Institute of Health and Biomedical Innovation, Queensland University of Technology, Brisbane, Australia
  2. Melbourne School of Population and Global Health, The University of Melbourne, Melbourne, Australia

  Correspondence to Dr Danielle L Herbert; d2.herbert{at}qut.edu.au

Abstract

Objective To estimate the time spent by researchers preparing grant proposals, and to examine whether spending more time increases the chances of success.

Design Observational study.

Setting The National Health and Medical Research Council (NHMRC) of Australia.

Participants Researchers who submitted one or more NHMRC Project Grant proposals in March 2012.

Main outcome measures Total researcher time spent preparing proposals; funding success as predicted by the time spent.

Results The NHMRC received 3727 proposals, of which 3570 were reviewed and 731 (21%) were funded. Among our 285 participants who submitted 632 proposals, 21% were successful. Preparing a new proposal took an average of 38 working days of researcher time and a resubmitted proposal took 28 working days, an overall average of 34 days per proposal. An estimated 550 working years of researchers' time (95% CI 513 to 589) was spent preparing the 3727 proposals, which translates into annual salary costs of AU$66 million. More time spent preparing a proposal did not increase the chances of success for the lead researcher (prevalence ratio (PR) of success for a 10-day increase=0.91, 95% credible interval 0.78 to 1.04) or other researchers (PR=0.89, 95% credible interval 0.67 to 1.17).

Conclusions Considerable time is spent preparing NHMRC Project Grant proposals. As success rates are historically 20–25%, much of this time has no immediate benefit to either the researcher or society, and there are large opportunity costs in lost research output. The application process could be shortened so that only information relevant for peer review, not administration, is collected. This would have little impact on the quality of peer review and the time saved could be reinvested into research.

  • Research funding
  • Evidence based medicine
  • Peer review
  • Statistics & research methods

This is an open-access article distributed under the terms of the Creative Commons Attribution Non-commercial License, which permits use, distribution, and reproduction in any medium, provided the original work is properly cited, the use is non commercial and is otherwise in compliance with the license. See: http://creativecommons.org/licenses/by-nc/3.0/ and http://creativecommons.org/licenses/by-nc/3.0/legalcode


Article summary

Article focus

  • Researchers would prefer to spend less time preparing grant proposals and more time on actual research.

  • The time spent preparing grant proposals is thought to be large, but we do not have accurate estimates of the total time spent across Australia.

Key messages

  • An estimated 550 working years of the researchers' time was spent preparing proposals for Australia's major health and medical funding scheme.

  • More time spent preparing a proposal did not increase the chances of success and there was no agreement between the researchers' ranking of their proposals and the results from peer review.

  • Most researchers understand that a perfect peer-review system is not realistic.

Strengths and limitations of this study

  • Our time estimates were retrospective and did not identify which sections of the proposal took the most time.

  • We used a short survey to increase the response rate, but this means we have limited data on the participants and their institutions.

  • Many researchers were reluctant to give us their proposal identification numbers, presumably because of confidentiality concerns.

Introduction

Project Grants are the major source of medical research funding in Australia, accounting for around 70% of all research funds awarded by the National Health and Medical Research Council (NHMRC) in 2012.1 Application numbers have risen steadily over time, making the process more competitive; there were 1881 proposals in 2003 and 3727 in 2012, a 98% increase. For Australian researchers, this increase in proposal numbers has led to declining success rates and budget cuts for successful proposals.

Project Grants aim to support single researchers or small teams for a defined project lasting 1 to 5 years. The application process takes almost a year and has remained essentially the same for the last decade. The funding round opens in December; full proposals are submitted online in March and assessed by two external reviewers (April–May); lead researchers provide responses to the reviewers' reports (May); grant review panels of 10–12 experts then assess each proposal, considering reports from two panel spokespersons and the applicants' responses to the reviewers' reports, and score each proposal (August–September). Funding is then allocated based on a ranking determined by the score until the budget is exhausted, and the successful proposals are announced (October–November). The budget for Project Grants beginning in 2013 was AU$458 million.

The process which Australia uses, involving the assessment of full proposals, is in contrast to several comparable funding bodies overseas which use staggered application processes. For example, the UK Wellcome Trust Investigator Awards first invite a research plan; shortlisted applicants are then invited to provide more information.2 The UK Engineering and Physical Sciences Research Council (EPSRC) has a similar staggered process for their Platform Grants,3 as does the USA National Science Foundation (NSF). The NSF's guidelines explain that a key reason for short-listing is reducing the wasted effort of researchers spending time preparing proposals with a low chance of success.4

Despite the importance of applying for research funding, the total time spent by researchers preparing and submitting proposals is not known.5 Guidelines on how to write grant proposals effectively advise that they cannot be written in a short amount of time,6 but we do not know if spending more time increases the chance of success. Professor Brian Schmidt, a Nobel Laureate in Physics based in Australia, recently highlighted the large amount of time Australian researchers were wasting on preparing lengthy proposals for Australian Research Council funding.7

We surveyed the Australian medical research community to estimate the time spent preparing proposals and to examine whether spending more time increased the chance of success. We also examined whether previous experience with peer review improved researchers' success.

Methods

Study design

In March 2012, Australian researchers working in health and medicine submitted 3727 proposals to the NHMRC Project Grant funding scheme.8 We attempted to contact the lead researchers of every proposal by contacting the offices of research of every Australian university and research institute. Of the 51 offices approached, 30 (59%) agreed to distribute an email invitation to their researchers. There was no reminder email. Willing researchers completed a short online survey from March to May 2012. The funding outcomes were announced by the NHMRC in October 2012. This study was approved by the Queensland University of Technology Ethics Committee (approval number 1100001472).

Survey questions

The online survey asked researchers to consider their time spent on proposals submitted in March 2012. For each proposal, we asked whether they were the lead researcher, how much time they spent (in days), and whether the proposal was new or a resubmission. We also asked about their previous experience with the peer-review system as an expert panel member or external peer reviewer, roles roughly akin to sitting on a journal's editorial board or reviewing papers for a journal. We asked for their salary in order to estimate the financial costs of preparing proposals. To protect the anonymity of our participants and to minimise their time spent completing the survey, we did not ask for any additional personal details or for the name of their institution.

For researchers who submitted two or more proposals, we asked them to rank their proposals in order of which most deserved funding. Researchers also responded to a hypothetical scenario concerning their desired level of reliability between two independent peer-review panels (box 1). This was used to estimate the desired reliability of the peer-review process. The hypothetical numbers of 100 proposals and 20 funded were based on a realistic NHMRC Project Grant panel.

Box 1

Hypothetical scenario on peer-review reliability

Question: Imagine that 100 Project Grant proposals in the same field have been reviewed by a panel of 10 experts. They selected 20 proposals for funding.

Now imagine that a second panel of 10 experts reviews the same 100 proposals and must independently decide on which 20 proposals deserve funding. How many of the 20 proposals originally selected for funding would you want to also be selected by the second panel?

Response Options: Exactly the same 20 proposals, a difference of 1 proposal, […], 20 completely different proposals.

Statistical methods

The total number of days spent preparing proposals was estimated using the following equation:

Total days = 3727 × [(1 − P) × {T(N,L) + (M − 1) × T(N,O)} + P × {T(R,L) + (M − 1) × T(R,O)}]

where 3727 is the total number of proposals in 2012, P is the proportion of resubmitted proposals, T() is the average time spent in days for a combination of new or resubmitted (N or R) proposals and lead or other (L or O) researchers, and M is the average number of researchers per proposal. This equation recognises that the resubmitted proposals usually take less time than new proposals, and that lead researchers generally spend more time than the other researchers. This estimate on the scale of working days was scaled to working years by assuming 46 working weeks per year. A bootstrap 95% CI was calculated by randomly resampling from the observed responses to capture the uncertainty in the time spent, number of researchers and proportion of resubmissions.9 Of the 3727 proposals submitted, 18 were subsequently withdrawn.8 These withdrawn proposals were included in our estimate of the total time, as this time is still valid for our aim of capturing the total researcher time spent preparing proposals across Australia.
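As a concrete illustration, the sketch below applies this equation to a small set of made-up survey responses and computes a bootstrap 95% CI by resampling responses. Only the constants (3727 proposals, 46 working weeks) come from the text; the response arrays and team sizes are illustrative assumptions, not the study data.

```python
# Minimal sketch (not the authors' code) of the total-time estimate and
# bootstrap CI described above. The survey responses are made up.
import numpy as np

rng = np.random.default_rng(1)

N_PROPOSALS = 3727        # total proposals submitted in 2012
DAYS_PER_YEAR = 46 * 5    # 46 working weeks of 5 days

# Hypothetical per-response survey data.
days = np.array([40, 25, 30, 10, 35, 20, 15, 45, 28, 12], dtype=float)
is_lead = np.array([1, 0, 1, 0, 1, 1, 0, 1, 0, 0], dtype=bool)
is_resub = np.array([0, 0, 1, 1, 0, 1, 0, 0, 1, 0], dtype=bool)
team_size = np.array([4, 5, 3, 6, 4, 5, 4, 3, 5, 4], dtype=float)

def total_years(days, is_lead, is_resub, team_size):
    """Apply the equation above to one (re)sample of the survey responses."""
    P = is_resub.mean()                 # proportion of resubmitted proposals
    M = team_size.mean()                # average number of researchers per proposal
    def t(resub, lead):                 # T(new/resubmitted, lead/other)
        mask = (is_resub == resub) & (is_lead == lead)
        return days[mask].mean() if mask.any() else days.mean()
    total_days = N_PROPOSALS * ((1 - P) * (t(0, 1) + (M - 1) * t(0, 0)) +
                                P * (t(1, 1) + (M - 1) * t(1, 0)))
    return total_days / DAYS_PER_YEAR

estimate = total_years(days, is_lead, is_resub, team_size)

# Bootstrap: resample responses with replacement to capture uncertainty.
n = len(days)
boot = [total_years(days[i], is_lead[i], is_resub[i], team_size[i])
        for i in (rng.integers(0, n, n) for _ in range(2000))]
print(f"{estimate:.0f} working years (95% CI {np.percentile(boot, 2.5):.0f} "
      f"to {np.percentile(boot, 97.5):.0f})")
```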

We used logistic regression to estimate the prevalence ratio (PR) of success according to the researcher's experience and time spent on the proposal. PRs are the ratio of two probabilities, whereas odds ratios (ORs) are the ratios of two odds.10 Using PRs allows us to make multiplicative statements about probabilities (eg, twice as likely) that are not possible with ORs.
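To make the distinction concrete, here is a toy calculation with assumed counts (not study data) showing how a PR supports a "twice as likely" statement while the corresponding OR does not.

```python
# Toy example (assumed counts, not study data): prevalence ratio vs odds ratio.
funded_a, total_a = 30, 100   # group A: success probability 0.30
funded_b, total_b = 15, 100   # group B: success probability 0.15

p_a, p_b = funded_a / total_a, funded_b / total_b
pr = p_a / p_b                                        # 2.0 -> group A is twice as likely to be funded
odds_ratio = (p_a / (1 - p_a)) / (p_b / (1 - p_b))    # ~2.43 -> not directly a probability statement
print(pr, odds_ratio)
```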

There were small amounts of missing data (0–7%) for the questions on researcher experience and times. These missing data were imputed using multiple imputation based on the observed responses. For example, 35% said that they had previously served on a peer-review panel, hence missing values to this question were randomly imputed as ‘Yes’ with probability 0.35.
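A minimal sketch of this kind of random imputation for a single yes/no question is shown below; the variable names and data are assumptions, and in the study the imputation was carried out within the Bayesian model rather than as a one-off step.

```python
# Single-imputation sketch of the approach described above (the study used
# multiple imputation inside the Bayesian model). Data are made up.
import numpy as np

rng = np.random.default_rng(0)
panel_member = np.array([1, 0, 1, 0, 0, np.nan, 1, np.nan, 0, 0])  # 1=Yes, 0=No, nan=missing

p_yes = np.nanmean(panel_member)          # observed proportion answering 'Yes' (35% in the study)
missing = np.isnan(panel_member)
panel_member[missing] = rng.random(missing.sum()) < p_yes   # impute 'Yes' with probability p_yes
```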

The imputation and logistic regression models were performed simultaneously using a Bayesian model, hence the final estimates of the PRs for success incorporate the uncertainty due to missing data. The model was fitted using the Bayesian WinBUGS software11 and the PRs are presented as means with 95% credible intervals.

We examined potential non-linear associations between time spent and success. These were a threshold beyond which more time did not increase the probability of success, log-transformed time and a quadratic association; however, we found no statistically significant associations (results not shown).

We compared the researchers' ranking of their proposals with their success or failure in the peer-review system. For each pair of proposals from the same researcher, we compared their relative low and high ranking with their funding success (yes or no). We only examined those proposals where there was a difference in success, as pairs of grants that were both failures or both successes contain no information for this analysis. We examined these results using a two-by-two table, χ2-test and κ agreement statistic.
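The sketch below shows how such an agreement check can be run with standard routines; the paired data are invented for illustration, and scipy and scikit-learn are our choices rather than necessarily the authors' tools.

```python
# Illustrative agreement check between researchers' own ranking and funding
# outcome, using invented data and off-the-shelf routines.
import numpy as np
from scipy.stats import chi2_contingency
from sklearn.metrics import cohen_kappa_score

# For each proposal in a discordant pair: was it the researcher's higher-ranked
# one (1) or lower-ranked one (0), and was it funded (1) or not (0)?
ranked_higher = np.array([1, 1, 0, 1, 0, 0, 1, 0, 1, 0])
funded        = np.array([0, 1, 1, 0, 0, 1, 1, 0, 0, 1])

table = np.array([[np.sum((ranked_higher == r) & (funded == f)) for f in (0, 1)]
                  for r in (0, 1)])                     # two-by-two table
chi2, p, _, _ = chi2_contingency(table)
kappa = cohen_kappa_score(ranked_higher, funded)
print(table, chi2, p, kappa)
```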

Results

Our online survey was started by 446 researchers, but only 285 (64%) provided us with their proposal number(s). We needed the proposal numbers in order to match the survey responses (completed from March to May 2012) with the success outcomes from the NHMRC (announced in October 2012), but many researchers were reluctant to give us this information. The 285 researchers who gave us their proposal numbers submitted 632 proposals. The funding success rate in our sample was 21%, the same as the overall NHMRC success rate, which indicates that our sample was representative of the wider population. The NHMRC received 3727 proposals, of which 3570 were reviewed and 731 were funded, giving a success rate of 21%.8

An estimated 550 working years of researchers' time was spent preparing the 3727 proposals (95% CI 513 to 589 working years). Based on the researchers' salaries, this is an estimated monetary cost of AU$66 million per year, which is 14% of the NHMRC's total funding budget. Each new proposal took an average of 38 working days of researcher time and resubmissions took an average of 28 working days: an overall average of 34 days per proposal. Lead researchers spent an average of 27 and 21 working days per new and resubmitted proposal, respectively, with the remaining time spent by other researchers.

More time spent on the proposal did not increase the probability of success (table 1). Owing to concern about a lack of power to detect an association between time spent and success, we performed a retrospective power calculation. We had 90% power to detect an increase in the probability of success of 0.028 for a 10-day increase in time spent (based on the observed times and successes of our sample). If we have missed a true association, it is likely to be smaller than a 0.028 increase in probability per 10 extra days of time spent.
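A simulation-based version of this power calculation could look like the sketch below. The sample size, baseline success rate and effect size come from the text, while the gamma time distribution and the use of statsmodels are illustrative assumptions, so the exact power reported will depend on those choices.

```python
# Simulation sketch of a power calculation for the effect described above:
# +0.028 in success probability per extra 10 days, n=632 proposals, baseline
# success rate 21%. The time distribution is an assumption.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n, base_p, effect_per_10d = 632, 0.21, 0.028
n_sims, alpha, hits = 500, 0.05, 0

for _ in range(n_sims):
    days = rng.gamma(shape=4.0, scale=8.5, size=n)               # mean ~34 days, right-skewed
    p = np.clip(base_p + effect_per_10d * (days - days.mean()) / 10, 0.01, 0.99)
    success = (rng.random(n) < p).astype(float)
    X = sm.add_constant(days / 10)                               # time in 10-day units
    fit = sm.Logit(success, X).fit(disp=0)
    hits += fit.pvalues[1] < alpha

print(f"estimated power: {hits / n_sims:.2f}")
```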

Table 1

Prevalence ratios of funding success by researcher experience and time spent on proposal

Experience with the peer-review system, as either an expert panel member or external peer reviewer, was associated with a higher probability of success, but these increases were not statistically significant (table 1). Resubmitted proposals had a significantly lower probability of success compared with new proposals (PR 0.64, 95% credible interval 0.43 to 0.92).

There was no agreement between the researchers' rankings of their proposals and which ones were funded (table 2). The χ2 test showed no association (χ2=0.93, p=0.34) and the κ agreement was negative (−0.06).

Table 2

Agreement between researchers’ relative ranking of their proposals and funding success

Researchers were willing to accept a wide range in reliability between two hypothetical peer-review processes (figure 1). The modal response was a difference of five proposals (meaning 15 the same), which is a 25% disagreement in funding between the two processes.

Figure 1

Desired reliability of a hypothetical system (see box 1 for hypothetical question).

Discussion

Australian researchers spend an enormous amount of time preparing grant proposals.12 We estimate that the 2012 NHMRC Project Grant scheme cost 550 working years of researchers' time, or AU$66 million in estimated salary costs. To put this quantum of resources into perspective, it exceeds the total annual staff costs at the Walter and Eliza Hall Institute (WEHI 2012, AU$61.6 million), one of Australia's major medical research institutes, which produced 284 peer-reviewed publications in 2012.13

As success rates for the Project Grant scheme are historically between 20% and 25%, the majority of time spent preparing proposals is wasted with no immediate benefit due to the failure to obtain funding. Some wasted time will be salvaged by submitting failed proposals to other funding agencies or resubmitting next year. However, resubmissions took just 10 days less on average to prepare than new submissions, and resubmissions had a 36% lower probability of success (table 1).

Spending more time on a proposal is no predictor of success (table 1), and the poor agreement between researchers' rankings and funding success (table 2) further demonstrates how hard it is to predict success and justify spending more time on proposals. These findings are consistent with previous studies on NHMRC Project Grants that have shown a high degree of variation in panel members' scores14 and a low correlation between the scores assigned for track record and bibliometric measures.15

Underestimating time and cost

Our cost estimates are likely to underestimate the true costs because some proposals are started but not submitted, and we did not capture the time of researchers who provided technical help or administrative staff who helped with the submission process. Also, our estimates do not include the costs of peer review, which would be the time of 1–3 external peer reviewers per proposal and an expert panel of 10–12 senior researchers meeting for a week, as well as the administrative time of organising this peer review.

Our findings are based on retrospective self-reported times spent preparing proposals, and we could not verify these times. Our study was designed to minimise participant burden and maximise our response rate by using a short survey that maintained anonymity. Participants completed our survey soon after the NHMRC closing date for submissions, which should have reduced recall bias. At the time of completing the survey, participants did not know whether their proposal had succeeded, hence our results are not biased by disgruntled researchers inflating their times. Future research could use diaries to prospectively collect the time spent preparing proposals and to identify the sections of the proposal that take the most time. Future research could also examine whether preparing unsuccessful proposals provides any benefits to the researchers in terms of refining their scientific ideas.

Excessive information

Researchers would prefer to spend less time writing proposals and more time on actual research.16 Our results show that most researchers do not expect a perfect system (figure 1); hence, the amount of information collected does not need to aim for the ‘ideal’ system shown in figure 2. The hypothetical association between the information that the system collects (which determines the time spent by researchers) and the accuracy of the system is plotted in figure 2. Underlying the figure is the notion that the marginal cost of providing more information rises (which is consistent with our results regarding time spent on grant preparation and success) while the marginal benefit of this information in improving the ranking of proposals declines.17 The standard way of optimising the amount of information collected is to equate the marginal benefit with the marginal cost, which occurs at the point of maximum net benefit. Beyond this point, marginal costs to the applicant outweigh the benefits even though there may still be improvements in the accuracy of ranking. One may even reach a point where the net benefits become negative, when additional information only confuses the ranking process.

Figure 2

Hypothetical association between the information collected for peer review and the accuracy of awarding the best proposals. To draw this association, we assume that all proposals can be ranked (without ties) from the best to the worst.

Our results suggest that the current NHMRC Project Grant system collects more information than is necessary, as the association between time spent (at an individual level) and success was negative (table 1), putting it on the downward slope of figure 2. Project Grant proposals are between 80 and 120 pages long and panel members are expected to read and rank between 50 and 100 proposals. It is optimistic to expect accurate judgements in this sea of excessive information. An alternative application process would use an initial short proposal, with shortlisted applicants then asked to provide more information that would be used to determine funding success.

Recommendations to minimise burden

Our time estimates are comparable with two small Australian studies on the time spent preparing proposals for NHMRC Project Grants. In 2004, a sample of 69 researchers spent an average of 20 days per proposal.18 In 2009, a sample of 42 lead researchers spent between 20 and 30 days per proposal, which, when extrapolated to the whole of Australia, gave an estimated total preparation cost of AU$41 million.14 In 2012, the Canadian Institutes of Health Research review of their Open Operating Grant Program included a survey of 378 researchers, who spent an average of 169 h (or 23 working days at 7.5 h per day) per proposal.19 In Canada, newly recommended reforms include a reduction in the amount of information submitted, to minimise the burden on applicants and peer reviewers.19

A recent review of health and medical research funding in Australia recommended that the NHMRC's online application process be simplified.20 We not only agree but also believe that the information requested for each proposal could be reduced, because the key scientific information used to judge a Project Grant's worthiness occupies just nine pages of a proposal that runs to around 80 to 120 pages. The proposals could therefore easily be shortened without any impact on peer review. The inclusion of a staged application process starting with an expression of interest (EOI), as used in the UK and the USA, would further minimise the burden on researchers. If an EOI could be used to reject 30% of proposals, and assuming that an EOI takes one-quarter of the time of a full proposal to prepare, then (based on our survey) this would save 124 years of researchers' time per year, equivalent to funding 124 new postdoctoral positions per year.
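The quoted saving follows from simple arithmetic under those two stated assumptions, as the short check below shows.

```python
# Back-of-envelope check of the 124-year saving quoted above, using only the
# stated assumptions: 550 working years in total, 30% of proposals rejected at
# EOI, and an EOI taking a quarter of the time of a full proposal.
total_years = 550
saved = total_years * 0.30 * (1 - 0.25)   # rejected applicants skip 75% of the work
print(round(saved))                       # ~124 working years per year
```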

Changes to the eligibility rules for resubmitting proposals from previous funding rounds could reduce the total number of applications and improve success rates. The number of UK proposals submitted to the EPSRC Platform Grant scheme almost halved between 2009–2010 and 2011–2012 (3379 vs 1938) and the success rate increased (30% vs 41%) after the EPSRC implemented stricter eligibility rules, including a repeatedly unsuccessful applicants policy.3 In our survey, the success rate for new proposals was higher than for resubmissions, so limits on the resubmission of Project Grants may reduce the time wasted preparing proposals by improving the chance of success.

The format of grant proposals could be shortened so that only information relevant for peer review, not administration, is collected; the administrative data could be collected at a later date for only those proposals that are successful. Another option is to scale the format of proposals to the total budget, with projects requesting smaller budgets submitting shorter proposals. The potential savings in researchers' time are enormous, since preparing research proposals takes between 1 and 3 months each year. If more of this time could be dedicated to actual research, then there would be more, and faster, medical research discoveries. Weighing down researchers in a lengthy grant proposal process is a poor use of their valuable time.

Acknowledgments

The authors are grateful to the Australian researchers who provided the survey data.

References

Footnotes

  • Contributors AGB, PC and NG conceived and designed the study and analysed the data. All authors interpreted the data, drafted the article or revised it critically for important intellectual content and approved the version to be published. AGB is the study chief investigator and is the guarantor.

  • Funding This work was funded by the National Health and Medical Research Council (Project Grant number 1023735).

  • Competing interests DLH salary is supported from NHMRC funding. AGB receives funding from NHMRC and QLD Government. PC receives funding from NHMRC, NIH and several other national and international health funding agencies. NG receives funding from NHMRC, ARC, NIHR, QLD Government and is the academic director of the Australian Centre for Health Services Innovation.

  • Ethics approval Queensland University of Technology Ethics Committee.

  • Provenance and peer review Not commissioned; externally peer reviewed.

  • Data sharing statement No additional data are available.