Article Text
Abstract
Objective Reporting guidelines can improve dissemination and application of findings and help avoid research waste. Recent studies reveal opportunities to improve primary care (PC) reporting. Despite increasing numbers of guidelines, none exists for PC research. This study aims to prioritise candidate reporting items to inform a reporting guideline for PC research.
Design Delphi study conducted by the Consensus Reporting Items for Studies in Primary Care (CRISP) Working Group.
Setting International online survey.
Participants Interdisciplinary PC researchers and research users.
Main outcome measures We drew potential reporting items from a literature review and a series of international, interdisciplinary surveys. Using an anonymous, online survey, we asked participants to vote on whether each candidate item should be included in a PC research reporting guideline and, if included, whether it should be required or recommended. Items advanced to the next Delphi round if they received >50% of votes to include. Analysis used descriptive statistics plus synthesis of free-text responses.
Results 98/116 respondents completed round 1 (84% response rate) and 89/98 completed round 2 (91%). Respondents included a variety of healthcare professions, research roles, levels of experience and all five world regions. Round 1 presented 29 potential items, and 25 moved into round 2 after rewording and combining items and adding 2 new items. A majority of round 2 respondents voted to include 23 items (90%–100% for 11 items, 80%–89% for 3 items, 70%–79% for 3 items, 60%–69% for 3 items and 50%–59% for 3 items).
Conclusion Our Delphi study identified items to guide the reporting of PC research that have broad endorsement from the community of producers and users of PC research. We will now use these results to inform the final development of the CRISP guidance for reporting PC research.
- PRIMARY CARE
- STATISTICS & RESEARCH METHODS
- QUALITATIVE RESEARCH
Data availability statement
Data are available upon reasonable request. Full data will be shared with researchers on appropriate request to the corresponding author.
This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/.
Strengths and limitations of this study
Our research team drew members from six nations, bringing diversity to the analysis.
We recruited a diverse Delphi panel to capture the perspectives of researchers, clinicians, educators and patients.
The Delphi surveys had a high response rate.
Most participants had English as their first language, but a large proportion of the respondents were bilingual.
Only four clinicians did not also hold a non-clinical professional role.
Introduction
Primary care (PC) is a distinct model of healthcare that can improve patient and population health,1 and PC has its own set of research questions that are of interest to the PC community. PC research uses an array of research methods and has developed approaches that emphasise patient-centred, problem-oriented care of whole patients.2 Despite the breadth in topics and approaches that are employed in PC research, there are underlying common elements that are always needed to make PC research useful for researchers, clinicians, patients and policy-makers.
Consensus Reporting Items for Studies in Primary Care (CRISP) is an international, interprofessional, interdisciplinary initiative to help improve the reporting of PC research (http://www.crisp-pc.org/). The goal of CRISP is to improve the quality and usefulness of reports of PC research so that the results may be appropriately applied to improve the process of care and health outcomes for patients and communities.
CRISP research has studied current practices, assessed needs and collated ideas for improvement through a scoping review2 and surveys of PC researchers and clinicians.3 4 These studies have demonstrated the need to improve PC research reporting, documented a desire for research reporting guidelines tailored to the needs and characteristics of PC research and generated lists of specific suggestions for items that would make reports more useful.2–4 Our prior work has emphasised that reports of PC research are not always useful to readers as the reports do not include contextual elements, nor the recognition of competing demands, nor the factors that impact on function such as multidisciplinary teams and therapeutic relationships.
Researchers across many fields recognise the need to improve research reporting.5 6 The Enhancing the QUAlity and Transparency Of Health Research (EQUATOR) network catalogues a growing number of guidelines for the reporting of health research (https://www.equator-network.org). Many have been widely adopted, with potential benefits including more effective dissemination, translation and implementation of new knowledge and reduction of research waste. Many well-known EQUATOR guidelines focus on standard research methods (eg, Preferred Reporting Items for Systematic Reviews and Meta-Analyses,7 Consolidated Standards of Reporting Trials8 and Strengthening the Reporting of Observational Studies in Epidemiology9), but the bulk of the 400-plus guidelines are discipline specific. However, no guideline focuses directly on the reporting needs of PC.
This study aims to reach a consensus around the potential items for the CRISP guidance statement based on the expertise of the international PC research community. We will use these results to inform the final CRISP guidance to improve the reporting of PC research.
Methods
We used a Delphi survey to reach a consensus on potential items for PC research reporting among the broad international community of producers and users of PC research. A Delphi survey is a consensus-building method that gathers opinions from a select group of participants and allows participants to compare their opinion to others in the group via consecutive surveys.10 We chose the Delphi design as the most appropriate consensus-building method as it enables participation by people who are distant in place.10 We published our study protocol online.11 Our reporting is informed by Guidance on Conducting and Reporting Delphi Studies12 and the Checklist for Reporting Results of Internet E-Surveys.13
Delphi panel
We sought to recruit a diverse panel to reflect the nature of the PC research community to represent the producers and users of PC research who bring unique expertise to the subject. This approach differs from most Delphi studies that typically use a small, homogeneous group of experts.10 12 We aimed for 100 participants to include practitioners, researchers, patients and policy-makers from high-income and lower-middle-income countries.
We recruited participants from a list of volunteers from our prior surveys and CRISP activities3 4 as well as our professional networks. We emailed volunteers, inviting them to complete a demographic survey and consent to participate in the online Delphi survey. Based on these survey returns, we identified target groups to guide our further invitation efforts.
We used a purposeful sampling procedure and developed a matrix to stratify targeted characteristics including world regions, demographic factors, healthcare professions, research disciplines, research roles and experience levels. Inclusion criteria required participants to be actively engaged in some aspect of PC, read English well enough to complete the survey, be able to access the online survey and give informed consent. We applied no exclusion criteria.
Delphi survey development
The round 1 survey presented a list of potential reporting items drawn from the results of our prior CRISP research: a needs assessment survey among the international PC research community,4 a survey focusing on the needs of practicing clinicians3 and a scoping review2 (table 1). In each of the surveys, we asked respondents for their views on what could be improved in PC research reporting and what items are important to include in research reports so that they are useful for their own research and/or clinical practice. We extracted the free-text comments from the two surveys, and EAS, WRP and PP synthesised the comments into an initial list of potential reporting items. The whole CRISP Working Group then reviewed the list, commented on each item and suggested new wording for clarity. We presented this aggregate list of potential items to the Delphi panel in round 1. We pilot tested the survey with the Working Group and colleagues, who made suggestions to improve the clarity of the potential items and survey instructions.
Participants received an email invitation between May and September 2021. We used Qualtrics XM software (Qualtrics, Seattle, Washington, USA) to provide respondents with online access to the closed Delphi survey; a unique survey code allowed us to link participant responses between rounds.
The survey presented questions in the same order to all participants (online supplemental appendices 1 and 2). Round 1 presented 29 potential items over 32 pages and asked participants to respond within 4 weeks. Round 2 presented 25 potential items over 29 pages and asked participants to respond within 6 weeks. Respondents could review and change their answers, and no question required a response to advance the survey. We did not offer any financial incentive for participation, but participants could elect to be named in the acknowledgements of the manuscript (see below).
Analysis
Round 1 presented 29 potential reporting items to the Delphi participants. We calculated the percentage of participants who voted to include each item, to exclude it, or who indicated they were unsure. Only participants who voted to include an item were asked whether it should be required or recommended. For incomplete surveys, we included all answered questions in the analysis.
Three investigators (EAS, WRP and PP) reviewed and summarised all comments and presented them to the Working Group, along with the descriptive statistics. Reworded items from round 1 were included in round 2 if they met the protocol criteria, which included at least 50% of participants agreeing that the item should be included (figure 1).11 We chose this relatively low threshold so as not to prematurely exclude items that could be reconsidered by participants after rewording or reflection on the comments from other participants.14
We prespecified that round 3 would proceed only if any item received less than 50% of participants agreeing it should be included, together with suggestions for changes in wording.11
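The advancement rule described above can be sketched as a short tally: for each item, count the share of votes to include and advance the item only if that share exceeds 50%. This is an illustrative sketch only; the function names and the example vote counts are hypothetical and are not the study data.

```python
# Hedged sketch of the per-item vote tally described in the Analysis
# section. Vote counts below are invented for illustration only.
from collections import Counter

def summarise_votes(votes):
    """Return the percentage of 'include', 'exclude' and 'unsure' votes."""
    counts = Counter(votes)
    total = len(votes)
    return {choice: 100 * counts.get(choice, 0) / total
            for choice in ("include", "exclude", "unsure")}

def advances(votes, threshold=50.0):
    """An item moves to the next Delphi round if >50% voted to include it."""
    return summarise_votes(votes)["include"] > threshold

# Hypothetical item: 60 include, 25 exclude, 15 unsure out of 100 voters.
item_votes = ["include"] * 60 + ["exclude"] * 25 + ["unsure"] * 15
print(summarise_votes(item_votes))  # include: 60.0, exclude: 25.0, unsure: 15.0
print(advances(item_votes))         # True: 60% > 50% threshold
```

A deliberately permissive threshold like this keeps borderline items alive so panellists can reconsider them after rewording, which matches the rationale given in the text.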
Patient and public involvement
We included clinician-researchers in our research team. Researchers, clinicians, educators and patients were involved as participants in the Delphi survey.
Results
Following our prespecified sampling procedure,11 we invited 116 respondents to participate in the Delphi study. Round 1 was completed by 98 (84%). Round 2 was completed by 89 of the original 98 (91%; 77% of those agreeing to participate) (table 2). Panellists were from all five world regions and demographics are detailed in table 2.
Of the 98 who completed round 1, 96% answered all questions. Tables 3 and 4 list the levels of endorsement for inclusion, requirement and recommendation for each potential reporting item in rounds 1 and 2, respectively.
Respondents suggested rewording or combining for most items. In addition, they suggested two new items: reporting demographics of participants (table 4, item 21) and theory informing research (table 4, item 16).
For one item in round 1, ‘report or translate measures into forms useful in PC patient care’, 28% of participants were unsure whether it should be included (ie, they answered ‘unsure’; not shown in table 3), and comments suggested that they did not understand the statement. The item was therefore substantially reworded for round 2 as ‘report findings in forms useful to PC clinicians and patients (Examples: number needed to treat, absolute risks instead of just relative risks, etc.)’ (table 4, item 18). A majority (81%) voted to include ‘describe the national and local healthcare system to allow comparison to other systems…’ (table 3, item 11); 14 of the 15 who answered ‘no’ were from North America.
We invited the 98 people who completed round 1 to participate in round 2, and 89 (91%) responded. Round 2 presented 25 items, including the two new items (table 4). Some items from round 1 were combined for round 2: (1) three items (table 3, items 4, 5 and 16) were combined into ‘describe how PC patients, practicing clinicians, community members and other stakeholders were involved in the research process’ and (2) three items (table 3, items 12, 19 and 26) were combined into ‘describe interventions and their implementation in sufficient detail to allow readers to judge applicability to routine practice in a variety of PC settings’.
There were limited suggestions for rewording in round 2 and no suggestions about adding or combining items. Round 2 results showed only minor changes from round 1, demonstrating Delphi panel consensus on the list of items. Therefore, per protocol,11 we did not proceed to a round 3 (figure 1).
A majority of round 2 respondents voted to include 23 items (90%–100% for 11 items, 80%–89% for 3 items, 70%–79% for 3 items, 60%–69% for 3 items and 50%–59% for 3 items). Among those voting to include items, over 50% voted to require reporting for 11 items. For many items, votes were relatively close between required and recommended (eg, 60:40 or less), with few items showing a strong preference.
Discussion
This Delphi study of the international PC research community reached consensus on potential items for guidance for the reporting of PC research. The study represents the first time that the PC research community has been consulted on this topic. These items highlight the unique needs of PC research and complement the items commonly listed in guidelines developed by other experts for specific research methods and other purposes.15 The process prioritised 21 reporting items for inclusion in reports of PC research, and there was limited support for making items mandatory.
While consensus was reached on all 21 reporting items, participants varied in the strength of their support for individual items. Greatest consensus (≥90% agreement to include) was reached for items relating to research implications, strategies to improve indexing and searching, and transferability of study findings. There was less support for items seen as more difficult to collect and report, including describing PC teams, relationships among patients, clinicians and researchers, and specific patient demographics. Potential items pertaining to research team background and experience in PC were less well supported; a few comments suggested this information might make some team members feel unwelcome or under-appreciated.
The preparatory programme of CRISP research and its international, interdisciplinary, inclusive approach gave this study particular strengths. Our Delphi panel engaged diverse participants reflecting the breadth of PC and its research enterprise. Consensus across these groups suggests broad agreement on what is important in PC and the research supporting practice, research, education and policy.
The Delphi process is a broadly accepted method for reaching consensus among expert groups.10 It is recommended for the development of research reporting guidelines, though not yet employed by most groups.16 There are multiple approaches to determining the ideal size and composition of Delphi panels.10 Most research reporting guidelines have relied on small homogenous groups of academic experts in research methodology. We elected to engage a large and diverse panel, as we recognise the complexity of PC research and the value of expertise contributed by researchers, clinicians, educators and patients.
Study limitations include the practical adaptations required by the COVID-19 pandemic. Our authorship team could not meet in person, but a small group (WRP, EAS and PP) held frequent virtual meetings and communicated by email with the Working Group. Most respondents had English as their first language, but there are no indications that needs differed between English-speaking and non-English-speaking participants, and the global geographic spread of participants increases confidence that the findings can be widely generalised. In addition, a large proportion of the respondents were bilingual. Only four clinicians did not also hold a non-clinical professional role; this may represent a missing clinician perspective or, more likely, the commonality of portfolio careers.17 18
This Delphi study is one of the final steps in the crystallisation of the CRISP reporting checklist for PC research. The Working Group will use the Delphi findings to inform the next steps, which will include a group discussion on the final wording and order of the items and pilot testing of the checklist with diverse groups of researchers with different levels of expertise and experience. The overall vision of CRISP is to improve research reporting in PC to ensure reports are as helpful as possible for researchers, patients, clinicians and policy-makers. This Delphi survey represents an important step on the CRISP journey to providing the support for PC researchers.
Ethics statements
Patient consent for publication
Ethics approval
This study involves human participants and was approved by Monash University Human Ethics Research Committee (26508). Participants gave informed consent to participate in the study before taking part. We protected identifiable data within password-protected accounts accessible only to EAS and PP.
Acknowledgments
We thank the Delphi participants who dedicated their valuable time and expertise to this study. The following agreed to be named in acknowledgements: Amnon Lahad, Bryna McCollum, Melina K. Taylor, Cylie Williams, Ann Kurth, Ian M. Bennett, Suzanne Nielsen, Donald Kollisch, Patricia Sampaio Chueiri, Rita McMorrow, Sharon James, Linda Mccauley, Monika Asnani, Margaret Flinter, Maria van den Muijsenbergh, Constance Dimity Pond, Miguel Marino, Patricia Thille, Mylaine Breton, Erin Wilson, Suzanne H. Richards, Barry Saver, Carlos Roberto Jaén, Jorge Pacheco, Lidia G. Caballero, Susan E. Hansen, Tehzeeb Zulfiqar, Roland GRAD, Cornelia van den Ende, Elizabeth Angier, Amanda L. Terry, David N Blane, Sally Hall Dykgraaf, Kate M Dunn, Yang Wang, Bruce Guthrie, James J Stevermer, Tyler Williamson, Jo-Anne Manski-Nankervis, Adnan Alam, Geoffrey Spurling, Patricia M. Klatt, Lee A. Green, Claire Madigan, Richard Young, Robin Gotler, Chandramani Thuraisingham, Tom Fahey, Glenville Liburd, Kees van Boven, John W. Beasley, Sam Merriel, Leif I Solberg, Karen Cardwell, Mingliang Dai, Musa Dankyau, Cristina LASERNA JIMÉNEZ, Jack Westfall, Jennifer Neil, An De Sutter, Helen Atherton, Ruth Walker, Klaus von Pressentin, Ronny Gunnarsson, Chris Barton, Joanne Enticott, Ai Theng Cheong, Howard A Selinger, Lauren Ball, Carolyn Chew-Graham, Juliana da Rosa Wendt, Eric M. Wall, Rachelle Ashcroft, Jan Radford, Annette Peart, Mary Alice Scott, Gillian Bartlett-Esquilant, Marita Hennessy, James W. Mold, Rebecca S. Etz, Hans Thulesius, Jumana Antoun, Oliver Frank.
References
Supplementary materials
Supplementary Data
This web only file has been produced by the BMJ Publishing Group from an electronic file supplied by the author(s) and has not been edited for content.
Footnotes
Twitter @LizSturgiss, @BillPhillipsMD, @FrankMoriarty, @joannelreeve, @grantrussell17
Contributors EAS and WRP conceived of the concept. EAS is guarantor for this publication. EAS, WRP, FM, AO, PLBJL and JCvdW all contributed to the methods. EAS and PP set up and distributed the surveys. EAS, PP and WRP performed the initial analysis. EAS, PP, WRP, FM, PLBJL, JCvdW, PG, TCOH, AO, JR, GR and CvW all contributed to the analysis. EAS and WRP wrote the draft manuscript. All authors contributed to the final manuscript.
Funding There was no specific funding for this study. EAS received salary support from an NHMRC Investigator Grant (NHMRC GNT1173011) during part of this work.
Competing interests None declared.
Patient and public involvement Patients and/or the public were involved in the design, or conduct, or reporting, or dissemination plans of this research. Refer to the Methods section for further details.
Provenance and peer review Not commissioned; externally peer reviewed.
Supplemental material This content has been supplied by the author(s). It has not been vetted by BMJ Publishing Group Limited (BMJ) and may not have been peer-reviewed. Any opinions or recommendations discussed are solely those of the author(s) and are not endorsed by BMJ. BMJ disclaims all liability and responsibility arising from any reliance placed on the content. Where the content includes any translated material, BMJ does not warrant the accuracy and reliability of the translations (including but not limited to local regulations, clinical guidelines, terminology, drug names and drug dosages), and is not responsible for any error and/or omissions arising from translation and adaptation or otherwise.