
Protocol
Reporting of methodological studies in health research: a protocol for the development of the MethodologIcal STudy reportIng Checklist (MISTIC)
  1. Daeria O Lawson1,
  2. Livia Puljak2,
  3. Dawid Pieper3,
  4. Stefan Schandelmaier1,4,
  5. Gary S Collins5,
  6. Romina Brignardello-Petersen1,
  7. David Moher6,7,
  8. Peter Tugwell8,9,
  9. Vivian A Welch7,9,
  10. Zainab Samaan1,
  11. Brett D Thombs10,11,
  12. Anders K Nørskov12,
  13. Janus C Jakobsen12,13,
  14. David B Allison14,
  15. Evan Mayo-Wilson14,
  16. Taryn Young15,
  17. An-Wen Chan16,
  18. Matthias Briel1,4,
  19. Gordon H Guyatt1,
  20. Lehana Thabane17,18,19,
  21. Lawrence Mbuagbaw1,20
  1. Department of Health Research Methods, Evidence, and Impact, McMaster University, Hamilton, Ontario, Canada
  2. Center for Evidence-Based Medicine and Health Care, Catholic University of Croatia, Zagreb, Croatia
  3. Institute for Research in Operative Medicine, Witten/Herdecke University, Cologne, Germany
  4. Institute for Clinical Epidemiology and Biostatistics, Department of Clinical Research, University and University Hospital of Basel, Basel, Switzerland
  5. Centre for Statistics in Medicine, Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences, University of Oxford, Oxford, UK
  6. Centre for Journalology, Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, Ontario, Canada
  7. School of Epidemiology and Public Health, Faculty of Medicine, University of Ottawa, Ottawa, Ontario, Canada
  8. School of Epidemiology and Public Health, Faculty of Medicine and Department of Medicine, University of Ottawa, Ottawa, Ontario, Canada
  9. Bruyère Research Institute, Ottawa, Ontario, Canada
  10. Faculty of Medicine, McGill University, Montreal, Quebec, Canada
  11. Lady Davis Institute for Medical Research, Jewish General Hospital, Montreal, Quebec, Canada
  12. Copenhagen Trial Unit, Rigshospitalet, Copenhagen University Hospital, Copenhagen, Denmark
  13. Department of Regional Health Research, The Faculty of Health Sciences, University of Southern Denmark, Odense, Denmark
  14. Department of Epidemiology and Biostatistics, Indiana University School of Public Health-Bloomington, Bloomington, Indiana, USA
  15. Centre for Evidence-based Health Care, Division of Epidemiology and Biostatistics, Faculty of Medicine and Health Sciences, Stellenbosch University, Cape Town, South Africa
  16. Department of Medicine, Women’s College Research Institute, University of Toronto, Toronto, Ontario, Canada
  17. Department of Health Research Methods, Evidence, and Impact, and Departments of Paediatrics and Anaesthesia, McMaster University, Hamilton, Ontario, Canada
  18. Biostatistics Unit, Father Sean O'Sullivan Research Centre and Centre for Evaluation of Medicine, Saint Joseph's Healthcare Hamilton, Hamilton, Ontario, Canada
  19. Population Health Research Institute, Hamilton Health Sciences, Hamilton, Ontario, Canada
  20. Biostatistics Unit, Father Sean O'Sullivan Research Centre, Saint Joseph's Healthcare Hamilton, Hamilton, Ontario, Canada

Correspondence to Daeria O Lawson; lawsod3@mcmaster.ca

Abstract

Introduction Methodological studies (ie, studies that evaluate the design, conduct, analysis or reporting of other studies in health research) address various facets of health research, including data collection techniques, differences in analytical approaches, reporting quality, adherence to guidelines and publication bias. As a result, methodological studies can help to identify knowledge gaps in the methodology of health research and strategies for improvement in research practices. Differences in methodological study names and a lack of reporting guidance contribute to a lack of comparability across studies and to difficulties in identifying relevant previous methodological studies. This paper outlines the methods we will use to develop an evidence-based tool—the MethodologIcal STudy reportIng Checklist (MISTIC)—to harmonise naming conventions and improve the reporting of methodological studies.

Methods and analysis We will search for methodological studies in the Cumulative Index to Nursing and Allied Health Literature, Cochrane Library, Embase, MEDLINE and Web of Science, check reference lists and contact experts in the field. We will extract and summarise data on the study names, design and reporting features of the included methodological studies. Consensus on study terms and recommended reporting items will be achieved via video conference meetings with a panel of experts, including researchers who have published methodological studies.

Ethics and dissemination The consensus study has been exempt from ethics review by the Hamilton Integrated Research Ethics Board. The results of the review and the reporting guideline will be disseminated in stakeholder meetings, conferences, peer-reviewed publications, in requests to journal editors (to endorse or make the guideline a requirement for authors), and on the Enhancing the QUAlity and Transparency Of health Research (EQUATOR) Network and reporting guideline websites.

Registration We have registered the development of the reporting guideline with the EQUATOR Network and publicly posted this project on the Open Science Framework (www.osf.io/9hgbq).

  • statistics & research methods
  • epidemiology
  • education & training (see medical education & training)

This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/.

Strengths and limitations of this study

  • To the best of our knowledge, this is the first study to design an evidence-based tool to support the complete and transparent reporting of methodological studies in health research.

  • This project will highlight the current reporting practices of authors of methodological studies and use these practices to outline a list of key reporting items.

  • The stakeholders recruited for the consensus study will represent a diverse group of expert health research methodologists including biostatisticians, clinical researchers, journal editors, healthcare providers and reporting guideline developers.

  • Our study does not incorporate a blinded consensus process, which may affect the flow of discussions during the conference meetings.

Introduction

Concerns with the quality and quantity of research have sparked interest in a rapidly evolving field that has been called meta-epidemiology, meta-research or research-on-research.1–3 This field addresses the entire research process, from question development to design, conduct and reporting issues, and most often uses research-related reports (eg, protocols, published manuscripts, registry entries, conference abstracts) as the unit of analysis. These studies may seek to ‘(1) describe the distribution of research evidence for a specific question; (2) examine heterogeneity and associated risk factors; and (3) control bias across studies and summarise research evidence as appropriate’.4 For the purpose of this project, we will refer to these research outputs as ‘methodological studies’, that is, studies that evaluate the design, conduct, analysis (eg, including bias, statistical plan and methods) or reporting of other studies in health research. This definition does not include statistical methodological studies (eg, studies testing new algorithms or analytical methods, simulation studies) or experimental studies in which the unit of analysis is not a research report. Methodological studies are important because they can identify gaps, biases and inefficiencies in research practices, and propose improvements and solutions.

A PubMed search performed in April 2020 for terms often used to describe methodological studies suggests that the rate of publication of methodological studies has increased over time, as illustrated in figure 1.

Figure 1

Trends in methodological studies indexed in PubMed from 2009 to 2019.
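For illustration, counts of the kind plotted in figure 1 can be retrieved programmatically from PubMed using the NCBI E-utilities esearch endpoint. The sketch below queries a single placeholder term year by year; it is not the query used for figure 1, and the search term, date field and year range are illustrative assumptions only.

```python
# Hypothetical sketch: yearly PubMed record counts for one placeholder term,
# retrieved via NCBI's E-utilities esearch endpoint. The term and date range
# are illustrative; the authors' actual April 2020 query is not reproduced here.
import json
import time
import urllib.parse
import urllib.request

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_count(term: str, year: int) -> int:
    """Return the number of PubMed records matching `term` published in `year`."""
    query = (f'({term}) AND ("{year}/01/01"[Date - Publication] : '
             f'"{year}/12/31"[Date - Publication])')
    params = urllib.parse.urlencode(
        {"db": "pubmed", "term": query, "retmode": "json", "retmax": 0})
    with urllib.request.urlopen(f"{EUTILS}?{params}") as response:
        result = json.load(response)
    return int(result["esearchresult"]["count"])

if __name__ == "__main__":
    for year in range(2009, 2020):
        print(year, pubmed_count('"meta-epidemiological study"[Title/Abstract]', year))
        time.sleep(0.4)  # stay within NCBI's rate limit for unauthenticated requests
```

Summing such counts across a set of candidate terms would yield a year-by-year trend comparable to figure 1.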

In the past 20 years, methodological studies have influenced the conduct of health research by informing many popular practices such as double data extraction in systematic reviews,5 optimal approaches to conducting subgroup analyses6 and the reporting of randomised trials, observational studies, pilot studies and systematic reviews,7–10 to name a few. Methodological studies have played an important role in ensuring that health research is reliable, valid, transparent and replicable. These types of studies may investigate bias in research,11 12 quality or completeness of reporting,13 14 consistency of reporting,15 methods used16 and factors associated with reporting practices,17 and may provide summaries of other methodological studies,18 among other issues. Methodological studies may also be used to evaluate the uptake of methods over time to investigate whether (and where) practices are improving and allow researchers to make comparisons across different medical areas.19 20 These studies can also highlight methodological strengths and shortcomings, such as sample size calculations in randomised controlled trials,21 22 the quality of clinical prediction models,23 and spin and over-interpretation of study findings.24–26 As such, methodological studies promote robust, evidence-based science and help to discard inefficient research practices.27 A draft conceptual framework of the various categories of methodological studies that we have observed is outlined in figure 2. Broadly, categories of methodological studies include those investigating bias and spin, methodological approaches to study design, and reporting issues.

Figure 2

Draft conceptual framework of categories of methodological studies. CONSORT, CONsolidated Standards Of Reporting Trials.

Despite the importance of methodological studies, there is no guidance for their reporting. Murad and Wang have suggested a modification to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA), a widely used reporting tool that is sometimes used for methodological studies because these studies often use methods that are also used in systematic reviews.28 Although a modification of PRISMA may work well for the data collection components of some methodological studies, it would fail to appropriately address the many different types of research questions that methodological studies attempt to answer. For example, if researchers were interested in changes in reporting quality of trials since the publication of the CONsolidated Standards Of Reporting Trials guidelines, they could use an interrupted time-series design. Also, methodological studies that include a random sample of research reports,29 or those structured as before–after designs19 would be a poor fit for the modified PRISMA tool, which is best suited for studies designed in the style of systematic reviews. Likewise, studies in which the unit of analysis is not the ‘study’ require more specific guidance (eg, when investigating multiple subgroup analyses or multiple outcomes within the same study).30 Thus, guidelines for transparent reporting of methodological studies are needed, and this need is widely acknowledged in the scientific community.31 32

Our work will address two main concerns:

  1. There are no globally accepted names for methodological studies, making them difficult to identify. Methodological studies have been called ‘methodological review’, ‘systematic review’, ‘systematic survey’, ‘literature review’, ‘meta-epidemiological study’ and many other names. The diversity in names compromises training and educational activities,33 and it makes it difficult for end-users (eg, clinical researchers, guideline developers) to search for, identify and use these studies.34 35

  2. The reporting of methodological studies is inconsistent, which may relate to differences in objectives, and to differences in transparency and completeness. That is, some studies may be better reported than others. While the most appropriate approach to reporting will depend on the research question, explicit, user-friendly and consensus-based guidance is needed to ensure that methodological studies are reported transparently and comprehensively.36

Aims

The aims of this study protocol are to outline the procedures to define and harmonise the names describing methodological studies, and to develop reporting guidelines for methodological studies in human health research.

Methods and analysis

Study design

We have adopted the strategy for the development of reporting guidelines proposed by Moher et al.37 A visual overview of this approach, highlighting key components of the process, is presented in figure 3. The three parts of the project which will be addressed using the above strategy are outlined in detail below (see online supplemental file for an outline of the data flow informing subsequent parts of the project).

Figure 3

Project overview for the development of reporting guidelines for methodological studies in health research.

Part 1: methodological review

The objectives of this part are to: (a) identify names used to describe methodological studies, (b) identify the various designs, analysis and reporting features of methodological studies, (c) find any previous reporting guidance and (d) identify methodological study experts.

Search strategy

We developed a search strategy, informed by our pilot work,38 targeting health-related sciences and biomedicine databases: Cumulative Index to Nursing and Allied Health Literature, Cochrane Library, Excerpta Medica (Embase), MEDLINE and Web of Science. There will be no limits by publication year, type or language. We will perform searches for authors known to publish in this field, check reference lists of relevant studies, check existing methodological study repositories (Studies Within a Trial and Studies Within a Review) and preprint servers (bioRxiv and medRxiv), set up Google Alerts for keywords (eg, meta-epidemiology, research-on-research) and contact experts (eg, via email, meetings, following relevant journals, subscribing to methods email newsletters including those of the Methods in Research on Research and the National Institute for Health and Care Excellence groups, and following researchers on social media platforms such as ResearchGate and Twitter) to identify additional methodological studies. We will also check the Enhancing the QUAlity and Transparency Of health Research (EQUATOR) library to identify any published or in-development reporting guidance. These approaches are informed by previous work and published literature.35 38 Two health sciences librarians at the Health Sciences Library (McMaster University) were consulted and reviewed the final search strategy (see online supplemental file) in line with the Peer Review of Electronic Search Strategies framework.39

Eligible studies

Studies that investigate methods—design, conduct, analysis or reporting—in other studies of health research in humans will be eligible. The ‘other studies’ (or research reports) refers to the unit of analysis of the methodological studies (eg, abstracts, cohort studies, randomised trials, registry records, study protocols, systematic reviews). Only published protocols and final reports of studies that investigate methods will be eligible. We will exclude simulation studies, studies testing new statistical methods (ie, there is no specific unit of analysis) and experimental studies of methods (ie, the unit of analysis is not a research report). These sorts of studies either already have reporting guidelines or can be reported in a commentary-style format.

Screening

A team of reviewers led by DOL will screen titles and abstracts independently and in duplicate in Rayyan,40 and full texts using standardised forms in DistillerSR.41 Both are online collaborative platforms for screening and reviewing literature. We will measure agreement on screening and study inclusion using Cohen’s kappa statistic.42 43 Any discrepancies between reviewers will be resolved through discussion.
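As a minimal sketch of the planned agreement statistic, the following computes an unweighted Cohen's kappa from two reviewers' hypothetical title-and-abstract decisions. In practice, decisions would be exported from Rayyan or DistillerSR rather than entered by hand, and a weighted or prevalence-adjusted variant could be substituted if preferred.

```python
# Minimal sketch of the agreement calculation: unweighted Cohen's kappa for two
# reviewers' include/exclude decisions. The decision lists below are hypothetical.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Unweighted Cohen's kappa for two equal-length lists of categorical decisions."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n   # observed agreement
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    categories = set(freq_a) | set(freq_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)  # chance agreement
    return (observed - expected) / (1 - expected)

reviewer_1 = ["include", "exclude", "include", "exclude", "exclude", "include"]
reviewer_2 = ["include", "exclude", "exclude", "exclude", "exclude", "include"]
print(round(cohens_kappa(reviewer_1, reviewer_2), 2))  # 0.67 for this toy example
```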

Data extraction

To document current reporting practices, we will extract data from included studies independently and in duplicate, using a standardised data collection form. Key data extraction fields for documenting methodological study features and reporting practices (eg, study design name, databases searched, any guideline use) are outlined in table 1. All data will be compiled in DistillerSR. Any discrepancies between reviewers will be resolved through discussion.

Table 1

Overview of data extraction fields for the review

All reviewers will undergo calibration exercises and pilot the screening and data collection forms (25 studies per reviewer). We will incorporate an emergent design in the data collection stage of the review, which is characterised by flexibility in the methodology, allowing researchers to remain open to modifications.44 Should any new information of interest arise during the full-text screen or data extraction, we will update the data collection form and collect this information for all studies, both retrospectively and going forward. Any modifications to the present protocol will be reported in the final published review. This iterative approach will allow us to capture information as new methodological study design features come to light during the full-text screening and data extraction phases, and data extraction will be updated accordingly for previously reviewed studies as needed. For example, we expect to see overlaps in methodological study names, some of which might be attributed to collaborating research groups. There also appear to be similarities in methodological study reporting styles borrowed from systematic review4 or survey study designs, which have both been extensively developed and are omnipresent in the health research literature. However, if the current data collection fields, listed in table 1, are insufficient to capture the nuances of the varieties of methodological studies, we will revise our data collection forms accordingly and collect the data for all studies.

Generation of a list of candidate items

The generation of a list of candidate items will be informed by two sources. First, a list of reporting items will be compiled based on what has been reported by authors of the included studies in the methodological review (eg, flow diagram, search strategy). We will also note the use of any reporting guidance mentioned by authors (eg, PRISMA, STrengthening the Reporting of OBservational studies in Epidemiology (STROBE)). Each item will be ranked from most to least frequently reported. Second, this list will be presented to expert user stakeholders alongside the proportion of methodological studies that report on each item, and stakeholders will be asked to propose additional relevant items to finalise the list of candidate reporting items for part 2.
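A hypothetical sketch of this ranking step is shown below: for each candidate reporting item, it counts how many included studies reported the item and ranks items from most to least frequently reported, together with the proportion that would be presented to stakeholders. The item names and study records are invented for illustration.

```python
# Hypothetical compilation of candidate reporting items: count how many included
# studies reported each item, then rank from most to least frequently reported.
from collections import Counter

# One entry per included study: the set of reporting items observed in that study.
studies = [
    {"flow diagram", "search strategy", "eligibility criteria"},
    {"search strategy", "eligibility criteria"},
    {"flow diagram", "search strategy", "unit of analysis"},
    {"eligibility criteria"},
]

item_counts = Counter(item for study in studies for item in study)
n_studies = len(studies)

for item, count in item_counts.most_common():   # most to least frequently reported
    print(f"{item}: {count}/{n_studies} ({100 * count / n_studies:.0f}%)")
```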

Data analysis

We will present the flow of articles retrieved and screened in a study flow diagram, and summarise data in tables with explanatory text. We will provide descriptive statistics, that is, counts (percentage) for categorical data, and means (SD) or medians (IQR) for continuous data. In addition to study names, we will synthesise and tabulate verbatim quotations for the study objectives, outcomes, and intended use of findings to provide context and clarification for methodological study rationales.45 We will qualitatively group studies into categories based on similarities in reporting features. All statistical analyses will be done in Stata V.15.1.46 We will identify additional potential stakeholders from the list of authors of included studies.

Part 2: consensus study

This part of the project will consist of consultation with expert user stakeholders in a consensus study. The objectives are to define methodological studies, and outline the recommended study name(s) and best reporting practices. The project steering group (DOL, GHG, LM, LT), which includes members with expertise in health research methods, will oversee the consensus study and development of the reporting guideline.

Identification of stakeholders

The steering group will be responsible for identifying expert user stakeholders based on expertise with methodological studies and expertise with reporting guideline development.47 Additional stakeholders will be identified from the list of authors (either corresponding or senior, with academic faculty status) of methodological studies from the review. In our selection of stakeholders, we will seek individuals who will be committed to participating and providing feedback for the reporting guideline. We define expert user stakeholders as researchers involved in the design, conduct, analysis, interpretation or dissemination of methodological studies. Approximately 20–30 stakeholders will be selected (including the protocol authors) as participants in the consensus exercises. We will track response rates to invitations to participate in the consensus study. We will collect participant demographics (eg, country, primary job title, academic rank, and methodological study publication history) to provide insight into the representation in this field of research based on sociocultural factors.

Measuring agreement and achieving consensus

The above definition of methodological studies (ie, studies that evaluate the design, conduct, analysis or reporting of other studies in health research) will be used during the online consensus exercises and video conference meetings. Participants will discuss the following: (a) names for methodological studies, (b) categories of methodological studies and (c) reporting requirements. These three components, outlined in table 2, will be completed electronically using LimeSurvey (https://reo.mcmaster.ca/limesurvey), a McMaster ethics-compliant online survey service.48

Table 2

Overview of consensus study activities and expected outputs

All video conferences will be facilitated by two investigators (DOL and LM). Stakeholders will be consulted for the development of drafts, elaborations and explanations for specific items. All steering committee members and stakeholders will be required to participate and vote during the consensus meetings. Disagreements will be resolved through discussion; if no consensus can be reached, the steering committee will put forward recommendations for the stakeholder group to approve. Zoom, or comparable video conferencing software, will be used so that meetings can be recorded.49

Data analysis

Findings from the consensus exercise will be summarised descriptively in tables that include counts (percentage) for categorical data, and means (SD) or medians (IQR) for continuous data. We will measure the levels of agreement (ie, percentage increase in agreement for successive rounds, number of comments made for each successive round and rounds with emergence of new themes) and instability (ie, spread and SD of ranked responses for each item) for each round.50 After the online exercises, one investigator (DOL) will qualitatively synthesise and code the suggestions for the methodological study names, categories and reporting items into common themes in Dedoose, a qualitative research software.51 The steering committee will synthesise data from the participant discussions to revise each subsequent draft.
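The sketch below illustrates, using invented response data, one plausible operationalisation of the round-level metrics described above: average agreement (the proportion of participants endorsing the modal response, averaged over items), the change in agreement between successive rounds, and the SD of ratings per item as an instability measure. It is a sketch under stated assumptions, not the study's definitive analysis code.

```python
# Sketch of round-over-round agreement and instability metrics for the consensus
# exercise. Response matrices are hypothetical: each row is one participant's
# ratings of the candidate items in a given round (here, 4 participants x 3 items).
import statistics

def percent_agreement(votes):
    """Proportion of participants endorsing the modal response per item, averaged over items."""
    per_item = []
    for item_votes in zip(*votes):                    # transpose: one tuple of votes per item
        modal = max(set(item_votes), key=item_votes.count)
        per_item.append(item_votes.count(modal) / len(item_votes))
    return sum(per_item) / len(per_item)

def instability(ratings):
    """SD of participants' ratings for each item (higher SD = less stable)."""
    return [round(statistics.stdev(item_ratings), 2) for item_ratings in zip(*ratings)]

round_1 = [[5, 3, 4], [4, 3, 4], [5, 2, 3], [3, 3, 4]]
round_2 = [[5, 3, 4], [5, 3, 4], [5, 3, 3], [4, 3, 4]]

print("Agreement round 1:", round(percent_agreement(round_1), 2))
print("Agreement round 2:", round(percent_agreement(round_2), 2))
print("Change in agreement:", round(percent_agreement(round_2) - percent_agreement(round_1), 2))
print("Instability (SD per item), round 2:", instability(round_2))
```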

Part 3: reporting guideline

The objectives of this part are to develop, refine, publish and disseminate the reporting guideline for methodological studies. We have registered the development of the reporting guideline—the MethodologIcal STudy reportIng Checklist—with the EQUATOR Network.52 The guideline name and acronym may be updated following deliberations during the consensus study. We will also consider which reporting items are appropriate for different categories of methodological studies. This will include discussions about whether a decision tree may be useful to direct users to other existing reporting guidelines should these be more appropriate for specific categories of methodological studies (eg, STROBE for methodological studies designed as cohort studies). Quantitative and qualitative findings from the consensus study will be incorporated into the final guideline document, which will include: (a) the recommended methodological study name(s) and categories, (b) the recommended checklist with the agreed reporting items, (c) a user guide and elaboration (eg, an explanation of why each item is important, rationales and an example of how it can be presented in a methodological study) and (d) a consensus statement. The draft document will be returned to the steering group and stakeholders to collect additional feedback. The checklist will be tested with end-users for face validity and clarity, and for additional fine-tuning as needed prior to publication. We will distribute the finalised checklist to a group of authors of methodological studies identified from the review (part 1) to assess its usefulness and whether it appropriately captures items relevant to the reporting of methodological studies.53

Patient and public involvement

Although patients and the general public are not directly involved in this project, the findings of this research will be relevant to a broad range of knowledge users including methodological study authors, health researchers, methodologists, statisticians and journal editors. We will seek recommendations from investigators for general public members and patients who could be recruited for this project.

Ethics and dissemination

This research has received an exemption (October 2019) from the Hamilton Integrated Research Ethics Board for the consensus study. Ethics committee approval and consent to participate are not required for any other component of this project since only previously published data will be used.

Data deposition and curation

All participant records and data will be stored in MacDrive, a secure cloud storage drive that is privately hosted in-house at McMaster University.54 Only two researchers (DOL and LM) will have direct access to study-related documents and source data. Qualitative data will be promptly coded and transcribed, and all audio files will be encrypted. As part of our knowledge translation (KT) strategy, and as a consequence of the difficulties we faced in retrieving methodological studies from literature databases during our pilot work, we have developed an open-access database of methodological studies (www.methodsresearch.ca). We will catalogue all included studies from the pilot and full reviews on this website so that end-users can easily retrieve them. We have also set up a submission portal for researchers to submit their studies to be catalogued in this database. Parallel research by our colleagues will use this database and explore the automation of retrieving and indexing methodological studies in a dedicated space.55

Dissemination

We will publish all manuscripts arising from this research and present the findings at conferences. We will set up a complementary website to serve as the primary repository for the published reporting guideline document. The inclusion of knowledge users and representatives from methodology journals and guideline groups on our core study team will aid the wide dissemination of the reporting guideline. We continue to contact journal editors for their endorsement, and encourage researchers to reach out to us about this work, as we have done previously.34 We will also encourage user feedback to inform future updates of the guideline as needed. These approaches are informed by our collective experience in developing and disseminating health research guidelines.7 56–60

Discussion

Our work will contribute to reducing research waste by: (1) making methodological studies transparent through streamlined reporting; (2) permitting researchers to appraise methodological studies based on adherence to the proposed guidelines; (3) allowing end-users of methodological studies to locate otherwise hard-to-find research in a dedicated database and promoting its continued development; and, in doing so, (4) allowing end-users of methodological studies to better evaluate and identify issues with study design and reporting that influence patient health, enabling them to apply methodological study evidence to their own research practices. Many methodological studies are done to improve the design, conduct, analysis and reporting of primary and secondary research. We anticipate that, in reviewing this body of evidence on research methods, we will further highlight the importance of studies that aim to improve the design of health research.61

Strengths and limitations

We acknowledge that there are inherent challenges in the search and retrieval of studies that lack consistent names or dedicated indexing in common health research databases. As such, it is plausible that certain methodological studies using terms not previously identified in the pilot or in our systematic database searches may be missed. To mitigate this limitation, we will contact (and have already contacted) experts in the field to identify additional studies, and we will screen the reference lists and citing articles of relevant studies. We have consulted extensively with librarians at the McMaster Health Sciences Library on optimal approaches to capture the maximum number of studies.

The uncertainty in the number of methodological studies currently available and published in the literature may present additional logistical and timing constraints for the review component and the overall progress of this work. However, given the landscape of methodological studies, we believe it is essential to apply a comprehensive search strategy. To help with the organisation of screening and data extraction, we will use robust systematic review management software (DistillerSR).41 Further, we have designed all screening and data extraction prompts to ensure the consistency and replicability of our work.

Lastly, our study does not incorporate a blinded consensus process, which may affect the flow of discussions during the video conference meetings. We will aim to moderate discussions so that dominant speakers do not steer the conversation and to ensure that all participants have a chance to speak. Additionally, we will share summaries of the discussion and decisions after the meetings, allowing participants to privately provide the steering group with any additional written feedback that may not have been addressed.

A key strength of this research is the diversity of our study team. We have brought together an international, multidisciplinary team with expertise in consensus activities, guideline development, research methodology and evidence synthesis. This breadth positions us to gather wide-ranging feedback and hold fruitful discussions with a broad array of users of the forthcoming guideline. Given the rise in the conduct of methodological studies, a general call for guidelines in the scientific community, and the number of teams that have reached out to us with interest in participating in this work, we are confident that the guideline will be used. However, we fully acknowledge the factors associated with the implementation and use of guidelines, notably journal endorsement, the passage of time and other study-level characteristics.20 62–66 Therefore, our stakeholders include editors from key journals that publish methodological studies, such as the Journal of Clinical Epidemiology, BMC Medical Research Methodology, BMC Systematic Reviews, The Campbell Collaboration and Cochrane. Stakeholders also include representatives from academic programmes building capacity, at the master’s and doctoral level, in conducting methodological research. To encourage better uptake, it has been suggested that researchers should work collaboratively with journals in the prospective design, knowledge translation and evaluation of reporting guidelines,67 as well as follow up on user feedback and incorporate a system to revise the reporting guidelines when necessary.68 These strategies have been incorporated in our KT plan.

Conclusions

This research will improve the transparency of reporting of methodological studies and help to streamline their indexing in, and retrieval from, literature databases. This work stands to make a substantial impact by informing research reporting standards for studies that investigate the design, conduct, analysis or reporting of other health studies, thereby improving the transparency, reliability and replicability of health research, and ultimately benefiting patients and decision makers. Future efforts will focus on field-testing the published checklist with authors of methodological studies, gathering feedback from end-users, and optimising and adapting the checklist for different typologies of methodological studies as needed.

Acknowledgments

We would like to thank Denise Smith and Jack Young at the Health Sciences Library at McMaster University for their critical review of the search strategy. We would like to thank Dr Susan Jack at the School of Nursing at McMaster University for sharing her expertise and recommendations on qualitative and mixed-methods research design.

References

Supplementary materials

  • Supplementary Data


Footnotes

  • Contributors DOL and LM conceived the idea. DOL, GHG, LM and LT contributed to the design of the study. DOL wrote the first draft of the manuscript. AKN, A-WC, BDT, DBA, DM, DP, EM-W, GHG, GSC, JCJ, LM, LP, LT, MB, PT, RB-P, SS, TY, VAW and ZS contributed to the refinement of the study methods and critical revision of the manuscript. All authors read and approved the final version of the manuscript.

  • Funding This research received no specific grant from any funding agency in the public, commercial or not-for-profit sectors. This project is being carried out as part of a doctoral dissertation by DOL. DOL is supported by an Ontario Graduate Scholarship and the Queen Elizabeth II Graduate Scholarship in Science & Technology, and the Ontario Drug Policy Research Network Student Training Program.

  • Disclaimer The funders had no influence over the design, data collection, analysis, interpretation, preparation of or decision to publish the manuscript.

  • Competing interests One or more authors have competing interests, which are declared in a Competing Interests statement provided with the manuscript.

  • Patient consent for publication Not required.

  • Provenance and peer review Not commissioned; externally peer reviewed.

  • Supplemental material This content has been supplied by the author(s). It has not been vetted by BMJ Publishing Group Limited (BMJ) and may not have been peer-reviewed. Any opinions or recommendations discussed are solely those of the author(s) and are not endorsed by BMJ. BMJ disclaims all liability and responsibility arising from any reliance placed on the content. Where the content includes any translated material, BMJ does not warrant the accuracy and reliability of the translations (including but not limited to local regulations, clinical guidelines, terminology, drug names and drug dosages), and is not responsible for any error and/or omissions arising from translation and adaptation or otherwise.
