Strengths and limitations of this study
This is the first study to prospectively test a simplified funding application process.
Simplified peer review processes save time and resources that could be spent on actual research.
The current lengthy process could be simplified without impacting greatly on funding outcomes by using a simplified panel.
The sample size was small because of the costs involved in convening the additional peer review panels.
Funding agencies use peer review to identify which proposals to fund, but evidence for the effectiveness of peer review in allocating research funding is lacking.1 Large costs are incurred in assembling the people and information required to allocate research funding,2 including: the applicants’ time spent preparing a proposal3–5; the peer reviewers’ time6–8; and the administrative burden on institutions and funding agencies.8–10 Our previous research estimated that 547 working years of researchers’ time ($A66 million in salary costs) was spent preparing Project Grant proposals for the National Health and Medical Research Council (NHMRC),3 and 2 years later, this had increased to 614 working years.11 It may be possible to reduce these high application costs without negatively affecting funding decisions.
The peer review of funding proposals is labour intensive. Most funding agencies use face-to-face meetings combined with prior assessment from panel members and external reviewers, for example, Canadian Institutes of Health Research,12 Engineering and Physical Science Research Council (UK),13 Medical Research Council (UK),14 National Institutes of Health (USA),15 and the National Science Foundation (USA).16 Proposals are long and detailed, and take time to prepare and assess. A simplified funding system would give researchers, as applicants or peer reviewers, more time for their research—an issue recognised more than three decades ago.17,18
Changes to a funding peer review process need to be evaluated in terms of the change in funding outcomes and change in costs. For funding outcomes, the key measure is the agreement between the changed process and the official process. Only a handful of studies have experimentally examined the agreement of funding processes. In 1977, the US National Science Foundation re-reviewed 150 proposals using a second independent peer review panel and found a 24–30% disagreement in funding outcomes.17 A Canadian study of 248 proposals submitted to two major funding agencies with similar peer review processes found a 27% disagreement in funding.19 In 2009, the Academy of Finland randomly assigned peer reviewers to two panels assessing the same 65 proposals, and found a 31–35% disagreement.20 Together, these studies show 65–76% agreement (24–35% disagreement). Similarly, in a survey of Australian researchers based on a hypothetical peer review scenario, we found that 75% was the median acceptable agreement for funding peer review.3 In the scenario, researchers were asked to imagine that 100 proposals had been assessed and 20 had been funded, and were then asked how many of these 20 proposals they would want to be selected by a second independent panel.
The objective of this study is to prospectively test shortened proposals and simplified peer review processes for the main funding scheme of the NHMRC of Australia. This involved the parallel assessment of actual proposals submitted to the NHMRC's Project Grant scheme in 2013. There were 3821 Project Grant proposals and the success rate was 16.9% with a total budget of $A419.6 million. We aimed to identify the agreement between the official process and the two simplified processes, and the peer review cost savings for the simplified processes.
This study uses data from simplified and journal peer review panels organised by the research team (figure 1), and the official NHMRC panels for Project Grant proposals.
The target research areas were Basic Science and Public Health. These areas were selected based on the findings from a NHMRC study that identified high (Basic Science) and low (Public Health) correlations between the track record scores from the official panels in 2001 and the corresponding bibliometric measures.21 These two fields were therefore chosen with the aim of examining the widest expected range in agreement.
A sample of 72 Project Grant proposals submitted to the NHMRC in March 2013 was voluntarily provided to the team by Australian researchers in response to email invitations sent through our existing contacts from previous studies. We used our contacts rather than a random sample of researchers in order to reduce the administrative costs of running the study. This may affect our sample's representativeness, although our contacts covered most Australian cities and a wide range of research institutes. The lead researchers provided our team with their proposals (March–April 2013), and their official NHMRC scores (October–November 2013). The provision of the proposal by the lead researcher was accepted as consent to participate.
For the official NHMRC process there were 43 panels, each with 12 members. During a week-long face-to-face meeting they assessed an average of 91 proposals, each of which was around 100 pages long. Prior to the meeting, the proposals were scored by two or more independent reviewers. Based on these scores, the lowest 33% of applications were labelled ‘not for further consideration’ unless a panel member wanted to rescue them. The remaining applications were discussed in the meeting. Each proposal was summarised by a primary spokesperson, followed by a wider panel discussion, and then followed by scoring. Conflicted panel members did not participate in the discussion or scoring. The mean score was used to create a rank and the proposals were funded in rank order until the budget ran out. The key information for our study is funded (yes or no).
We used a simplified process where panel members reviewed a shortened proposal which included the nine-page research plan and a two-page track record for each chief investigator. A list of sections used is given in table 1. The simplified panels were convened by our research team in June 2013 before the official panels (July–September 2013). Our findings had no bearing on the official awarding of funding in October 2013. Members of the simplified panel did not participate in the official panels, but they may have participated as external reviewers for other proposals.
Panel members provided written consent to participate, signed a confidentiality agreement and were paid an honorarium for their participation. The payment of travel expenses, accommodation and an honorarium is standard policy for the official panels to attend a face-to-face meeting.
Each seven-person simplified panel reviewed either 36 Basic Science or 36 Public Health proposals in separate 1.5-day face-to-face meetings. Each panel member was a spokesperson for five or six proposals, and they gave an opening summary of the strengths and weaknesses of the proposal. The panel was allowed a maximum of 15 minutes to discuss each proposal. Before the discussion, the panel chair asked all panel members if they had any real or perceived conflicts; the conflict rules were used to match the official peer review process. All scores were given by written secret ballot, and there was no group discussion of the scores.
The journal panels were designed to work like most journals, where the decision to publish is based on the results of two or more independent reviewers. We used two journal panel reviewers per proposal, who only considered the nine-page research plan, reference list and synopsis (table 1). Each panel member reviewed and scored either 6 or 12 proposals (May–August 2013). Proposals were assigned to reviewers based on their expertise in Basic Science or Public Health and with an absence of conflicts of interest.
The official panels rank proposals using a weighted combination of three criteria-based integer scores (from a low of 1 to a high of 7) for scientific quality, significance and innovation, and track record. The scores determine an overall ranking, and the highest ranked proposals are awarded funding within the budget limitations. Despite the seven-point scale, proposals typically receive one of just three scores. For example, in 2013, almost all proposals scored a 4, 5 or 6 (94.8%); the highest category of 7 (0.1%) and the lowest categories of 3 (4.9%), 2 (0.2%) and 1 (nil) were rarely or never used.
We used a simplified scoring process where panel members rated each proposal as: definitely fund, possibly fund, or definitely do not fund. This simplified score is designed to help peer reviewers focus on the actual decision rather than on a more complex criteria-based scoring system, which is a step removed from the final decision and has been described as oblique by some reviewers.22 We awarded funding in our simplified panel if 50% or more of the seven-person panel recommended ‘definitely fund’, and for our journal panel if both external reviewers recommended ‘definitely fund’.
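The two funding rules described above can be sketched as follows. This is an illustrative sketch only, not the study's actual analysis code: the function names are hypothetical, while the panel sizes, vote labels and thresholds follow the text.

```python
# Hypothetical sketch of the two simplified funding rules described in the text.
# Vote labels and thresholds follow the paper; function names are illustrative.

def simplified_panel_funds(votes):
    """Fund if 50% or more of the seven-person panel vote 'definitely fund'."""
    definite = sum(1 for v in votes if v == "definitely fund")
    return definite >= len(votes) / 2

def journal_panel_funds(reviewer_a, reviewer_b):
    """Fund only if both independent reviewers recommend 'definitely fund'."""
    return reviewer_a == "definitely fund" and reviewer_b == "definitely fund"

# Example: 4 of 7 panel members vote 'definitely fund', so the panel funds it
votes = ["definitely fund", "possibly fund", "definitely fund",
         "definitely fund", "definitely do not fund",
         "definitely fund", "possibly fund"]
print(simplified_panel_funds(votes))                            # True (4/7 >= 50%)
print(journal_panel_funds("definitely fund", "possibly fund"))  # False
```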
Cross-tabulations were used to examine the agreement between the simplified and official panels for the dichotomous funding outcomes (yes or no). The main outcome is the percentage agreement in funding, for which CIs were generated using a bootstrap algorithm. We use agreement because our aim was to find processes that were as good as the official process, but with lower costs. Our previous survey of Australian researchers found the median threshold (from 145 responses) of acceptable agreement for two hypothetical review panels assessing the same proposals was 75%; therefore, this level is a meaningful threshold for interpreting acceptable agreement.3 We apply this threshold to the percentage agreement without adjusting for chance agreement, as this is the agreement that would be observed in practice.
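The main analysis can be illustrated with a short sketch: percentage agreement between two sets of dichotomous funding outcomes, with a percentile bootstrap CI obtained by resampling proposals with replacement. This is not the study's actual R code, and the example data below are made up for illustration.

```python
# Illustrative sketch of the main analysis (the study used R; data are made up).
import random

def percent_agreement(a, b):
    """Percentage of proposals given the same funding outcome by both processes."""
    return 100.0 * sum(x == y for x, y in zip(a, b)) / len(a)

def bootstrap_ci(a, b, n_boot=10_000, alpha=0.05, seed=42):
    """Percentile bootstrap CI, resampling proposals with replacement."""
    rng = random.Random(seed)
    n = len(a)
    stats = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        stats.append(percent_agreement([a[i] for i in idx], [b[i] for i in idx]))
    stats.sort()
    return stats[int(n_boot * alpha / 2)], stats[int(n_boot * (1 - alpha / 2)) - 1]

# Made-up example: 72 proposals, the two processes disagreeing on 20 of them
official   = [True] * 15 + [False] * 57
simplified = [True] * 5 + [False] * 10 + [True] * 10 + [False] * 47
point = percent_agreement(official, simplified)  # 52/72 agree, about 72%
low, high = bootstrap_ci(official, simplified)
```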
Data on time spent and travel were used to estimate the costs of peer review. Members of the simplified panels reported their time spent reviewing the 36 proposals in preparation for the face-to-face meeting, and their time spent preparing a spokesperson report for each allocated proposal. Travel and accommodation costs to convene the face-to-face meetings were also included. The journal panel reported on their time spent reviewing each proposal.
The R package (V.3.0.2) was used for all analyses.
Official and simplified panel members
Most panel members had senior academic appointments of Professor or Associate Professor, and had prior experience of being a NHMRC peer reviewer (table 2). Compared with the official panels, our panels had more women and more members from Group of Eight universities, but were similar in terms of academic level.
Agreement between the simplified and official processes
The mean agreement between the simplified and official panels (72%, 95% CI 61% to 82%), and the journal and official panels (74%, 62% to 83%) was just below the acceptable threshold of 75% (table 3). The agreement about which proposals to fund was lower than the agreement about which proposals not to fund. This is partly because many more proposals were not funded than funded. The agreement between the simplified and official processes was slightly lower for Basic Science than for Public Health. The mean agreement between the two simplified panels (79%, 68% to 89%) was above the 75% threshold (table 4).
Time spent on simplified peer review
Twice the amount of time was spent reviewing a Basic Science proposal compared with a Public Health proposal (table 5), possibly due to the technical nature of Basic Science proposals. Similar amounts of time were spent preparing a spokesperson report for the simplified panel or a journal panel review. The simplified panel peer review cost $A1109 per proposal, including the costs to attend a face-to-face meeting. The peer review cost for the journal panel dropped to $A359 per proposal because of the smaller number of reviewers, and absence of travel and accommodation costs. The majority of these costs come from the reviewers’ time.
We previously estimated the costs of peer review for the 2009 official funding round at $A4.44 million for 2983 proposals.23 Based on these figures, the cost per proposal in 2013 was $A1649 (adjusted for inflation). Hence, the estimated cost of the official peer review process in 2013 for 3821 proposals is $A6.3 million. In comparison, reviewing the same number of proposals would cost an estimated $A4.2 million using the simplified panels and $A1.4 million using the journal panels. This gives estimated savings of $A2.1–$A4.9 million per year from using our simplified review processes.
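The cost comparison is simple arithmetic from the per-proposal figures reported above, as this back-of-envelope sketch shows (2013 Australian dollars; the dictionary keys are illustrative labels).

```python
# Back-of-envelope reproduction of the cost comparison in the text,
# using the per-proposal figures reported in the paper (2013 $A).
N_PROPOSALS = 3821
cost_per_proposal = {
    "official": 1649,          # official NHMRC process, inflation-adjusted
    "simplified_panel": 1109,  # 1.5-day face-to-face simplified panel
    "journal_panel": 359,      # two independent journal-style reviewers
}

totals = {name: cost * N_PROPOSALS for name, cost in cost_per_proposal.items()}
for name, total in totals.items():
    print(f"{name}: $A{total / 1e6:.1f} million")

savings_simplified = totals["official"] - totals["simplified_panel"]
savings_journal = totals["official"] - totals["journal_panel"]
print(f"estimated savings: $A{savings_simplified / 1e6:.1f}"
      f"-{savings_journal / 1e6:.1f} million per year")
```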
Using shortened proposals and simplified peer review processes gave a close to adequate agreement with the official NHMRC panels. The NHMRC streamlined the application process for the 2014 round and removed many sections (table 1). Our results indicate that this streamlining would not have greatly altered funding outcomes.
By examining the agreement of the streamlined systems with the current system we imply that the current system is a ‘gold standard’, but the number of peer reviewers per proposal needed to provide anything like a gold standard is in the thousands,24 whereas the current system uses around 12 reviewers per proposal. Despite this, our aim was to show reasonable agreement with the current system in terms of funding, but with lower costs. In other words, we aimed to find an equally imperfect system, but with lower costs. We chose funding as the key (binary) outcome, rather than continuous outcomes such as scores because funding is what matters most to applicants.
A key strength of this study was the rare opportunity to convene experimental peer review panels to assess actual proposals in parallel with the official process. Our relatively small sample size of 72 proposals is comparable to a Finnish study of 65 proposals using two panels.20 Large sample sizes are difficult in this field because of the high costs of using face-to-face meetings.
The success rate for our sample of Basic Science proposals was higher than the official success rate (31% vs 19%), and for Public Health, the success rate was lower (11% vs 13%), indicating some difference in calibre between the study proposals and the wider population of proposals.25 The much higher success rate in Basic Science may be because the researchers who were willing to provide their proposals for experimental peer review were more senior.
We expect there to be more consensus in funding decisions for the best and worst proposals.22,26 A related study of journal peer review found the agreement for paper publication was twice as likely for the rejection of an article compared with acceptance,27 and a related study of funding peer review in Finland found a higher reliability for identifying average and poor proposals than good proposals.20 Our results also show a stronger agreement about what proposals should not be funded compared with what proposals should be funded. This could be because reviewers are consistently able to find proposals that have significant flaws, but find it harder to separate high-quality proposals.
The agreement found in this study is comparable to the small number of other studies of observed agreement (65–76%) when comparing similar or identical peer review systems.18–20 Most researchers understand that peer review processes are unlikely to ever achieve perfect agreement, as even identical peer review processes will give different funding outcomes because of the inherent variability due to subjectivity in peer review.7,23 Our comparisons between panels included many sources of variability, including measurement error and variability due to differences in panel members and their preferences, and these sources of variability will always be part of peer review.
Simplified application processes should save time for researchers as applicants and peer reviewers. In this study we only examined the costs saved by peer review which were between $A2.1 and $A4.9 million per year due to reduced travel costs and reviewer time. Our previous research estimated that the majority of costs for the NHMRC Project Grant scheme were for applicants (85%), with the remainder incurred by peer reviewers (9%) and administrators (5%).23 The high applicant costs are due to an average application time of 34 working days. Simplified processes should take less application time and hence save even more costs, although surprisingly our recent research found that time spent on applications increased after the application process was simplified.11
The journal panel did not consider track record, but still had reasonable agreement with the official process. This could be because researchers with strong track records are more experienced at writing proposals. An application without track record could save large amounts of application time because, at present, each researcher must write a two-page CV (curriculum vitae) and keep their publication information up to date in the online system.
One potential disadvantage of a journal panel is that by using fewer reviewers there would be more proposals with the same score, and this would create a problem if the funding line straddled a set of tied proposals. In this case either a third reviewer could be sought or the winners could be selected at random on the basis that they are equally good.
The journal panels had a low rate of funding, awarding just 6 of 72 (8%). This could be because both reviewers needed to recommend funding. It could also be because independent reviewers give harsher scores when working alone compared with working in a group. However, two studies that examined the change in preliminary scores after panel discussion found that scores were more likely to get worse than better.20 ,28
Everyone would gain from simplified peer review systems that are cheaper: the funding agencies, institutions, and the researchers as applicants and peer reviewers. Funding agencies around the world face the challenge of a static or diminishing pool of funds. A way to increase the amount of money allocated to research is to improve the efficiency of the process and return the cost savings to the funding pool. Our simplified peer review process can save costs and researchers’ time, and provide estimated savings of $A2.1–$A4.9 million that could be used to fund additional proposals or spend on actual research. The NHMRC has started a Streamlining Application and Assessment Project,29 and the most recent federal budget assigned $A9.9 million over 5 years from 2014–2015 to “develop a nationally consistent approach to the way clinical research trials are overseen and conducted and to streamline and simplify National Health and Medical Research Council grant application and assessment processes”.30 Our results indicate that a very low cost journal-style system with short applications that do not use track record could potentially replace the current more complex and costly system. Funding agencies may want to see more evidence before making such a large change to their systems, and they could do this by running parallel panels that use a simpler system and comparing the outcomes with the standard system. This requires some additional costs to set up the parallel panels, but these one-off costs would be offset by the savings in future funding rounds if the comparison showed that the simpler system performed well.
The authors are grateful to the Australian researchers who provided their funding proposals and participated as panel members.
Twitter Follow Adrian Barnett at @aidybarnett
Contributors DLH, NG, PC and AGB designed the study. DLH led the data collection with input from NG and AGB. AGB led the data analysis with input from DLH, NG and PC. All the authors were involved in the interpretation of the results. DLH wrote the first draft of the manuscript with input from NG, PC and AGB. AGB is the study chief investigator and guarantor.
Funding This work was funded by the National Health and Medical Research Council (NHMRC Project Grant number 1023735).
Competing interests None declared.
Ethics approval Queensland University of Technology Ethics Committee.
Provenance and peer review Not commissioned; externally peer reviewed.
Data sharing statement Full data sets (with some blinding to preserve anonymity) and statistical codes are available from the corresponding author at email@example.com.