
Original research
Exploring the development of a framework of social accountability standards for healthcare service delivery: a qualitative multipart, multimethods process
Alex Anawati1,2, Erin Cameron3, Jacqueline Harvey4

  1. Clinical Sciences, Northern Ontario School of Medicine, Sudbury, Ontario, Canada
  2. Emergency Department, Health Sciences North, Sudbury, Ontario, Canada
  3. Human Sciences, Northern Ontario School of Medicine University, Thunder Bay, Ontario, Canada
  4. Northern Ontario School of Medicine University, Sudbury, Ontario, Canada

Correspondence to Dr Alex Anawati; aanawati@nosm.ca

Abstract

Objectives Social accountability is an equity-oriented health policy strategy that requires institutions to focus on local population needs. This strategy is well established in health professional education, but there is limited understanding of its application in healthcare service delivery. Building on what is known in the education setting, this study aimed to explore the development of a framework of comprehensive, evidence-based social accountability standards for healthcare service delivery institutions.

Design This qualitative, multipart, multimethods study consisted of a modified Delphi process, guided by an evidence-based social accountability tool for health professional education, and complementary methods, including developmental evaluation and a review of select literature, to capture emerging evidence and contextual relevance.

Setting The study took place in Northern Ontario, Canada, at a medical school and a tertiary, regional academic health sciences centre that are both grounded in social accountability.

Participants Eight expert participants from diverse, multidisciplinary backgrounds, including a patient advocate, were purposefully recruited and enrolled from both institutions; seven completed the study.

Main outcome The resulting framework of social accountability standards is organised into 4 major sections that capture broad and critical concepts; 17 key component reflective questions that address key themes; 39 aspirations that describe objective standards; and 197 indicators linked to specific expectations.

Results Three modified Delphi rounds were completed, producing a framework of consensus-derived standards. Developmental evaluation helped identify facilitators and barriers and provided real-time feedback on the study’s processes and content. The literature reviewed identified 10 new concepts and 43 amendments.

Conclusion This study highlights the development of a comprehensive, evidence-based framework of social accountability standards for healthcare service delivery institutions. Future studies will aim to evaluate the application of these standards to guide equity-oriented social accountability health policy strategies in healthcare service delivery.

  • Health equity
  • Health services administration & management
  • Health policy
  • Medical education & training
  • Social medicine
  • Qualitative research



This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/.


STRENGTHS AND LIMITATIONS OF THIS STUDY

  • This study’s multipart, multimethods design allowed expert opinions to be captured, compared and validated along with real-time feedback on the study’s processes and content.

  • The study was intentionally designed to be consistent with social accountability strategies in health professional education.

  • Expert participants from multidisciplinary backgrounds, including a patient advocate, were purposefully recruited from both a medical school and a tertiary, regional academic health sciences centre that are grounded in social accountability.

  • The highly focused area of study resulted in a small participant group but benefited from the inclusion of motivated, content-expert ‘insider researchers’ to organise the methods and results from a valid social accountability determinate point of reference.

  • The modified Delphi process engagement was negatively impacted by the COVID-19 pandemic; however, almost all participants reviewed the final iteration of standards and participated in the final vote.

Introduction

Social accountability is an attractive health policy strategy for healthcare service delivery institutions such as hospitals and medical clinics. It has been successfully advocated for on moral and ethical grounds and for the promising possibility of improving equity-oriented and local, context-specific outcomes.1–16 Social accountability’s allure as a health policy strategy has led to it being identified as an important part of academic health sciences centres’ quadripartite missions,2 as a key consideration for emergency departments,1 as critical in continual quality improvement and health learning systems,3 and as a strategy to better address community needs in primary care.7 9 11–13 15 Increasingly, social accountability is becoming a core mandate in strategic plans within academic health science centres and primary care networks.10 17

Despite this, social accountability as a health policy strategy remains enigmatic, difficult to define and apply, and inconsistently implemented.4 8 There is evidence that social accountability in healthcare service delivery can include everything from very specific interventions, such as screening for poverty or practising trauma-informed care,15 to using socially accountable community score cards in maternal and reproductive health.5 6 It is also simultaneously described as a much more comprehensive strategy that can be articulated across micro, meso and macro levels of care.9 11 13 15 This has led not only to confusion about social accountability in healthcare service delivery settings, but also to a paucity of clear evidence on how to implement it as a comprehensive health policy strategy.

In the different, but linked, environment of academic institutions and health professional education, this is not the case. In health professional education, social accountability is well defined by the WHO as the obligation to be accountable to society by directing activities (education, research and service) towards local priorities.18 At its core, social accountability is a feedback loop of coidentifying priorities with community, making changes and then assessing impact.18–20 Integral to a social accountability strategy is meaningful engagement with different partner groups (academics, health administrators, policy-makers, health professionals and linked sectors),19–21 along with the values of relevance, quality, cost-effectiveness and equity.18–20 Social accountability requires community contextualisation, anticipation of needs and validation with community that their needs are met, rather than merely an awareness of community needs (social responsibility) or a reaction to them (social responsiveness).20

In health professional education, social accountability is widely implemented as a health policy strategy. It is supported by a global consensus,22 23 is the mandate of academic institutions in multiple countries24 and is recognised as an accreditation standard in Canada.25–27 More importantly, there are multiple tools that help academic institutions implement it as a health policy strategy.18 23 24 28–35 One of the tools developed for academic institutions is the Social Accountability Framework for Health Workforce Training (SAFHWT),24 30 33 an internationally validated tool that was developed using a modified Delphi process. Education programmes that adhere to these standards show positive impacts on graduate retention,36 37 economic stimulation,38–40 graduates’ orientation towards generalist disciplines,41 graduates’ tendency to practise in smaller and lower income communities,42 and strengthening of community healthcare services.43

To expand the impacts of social accountability beyond academic institutions and health professional education, there is a need for similar tools that support the implementation of social accountability as an equity-oriented health policy strategy in the different settings of healthcare service delivery. To fill this gap, this study aimed to explore the development of a framework of comprehensive, evidence-based social accountability standards for healthcare service delivery contexts.

Methods

Research team

This study was developed and led by an emerging clinical researcher (AA) and mid-career education researcher (EC). Together, they brought expertise in social accountability, healthcare service delivery and qualitative research methodologies.

Design

The study design consisted of a modified Delphi process, a review of literature identified by the Delphi expert panel and developmental evaluation (DE) (figure 1). In total, three successive modified Delphi rounds were conducted, each progressively refining the framework of social accountability standards for suitability to healthcare service delivery. DE was used to capture important data beyond the purpose of the modified Delphi process and ultimately informed the process and identified additional literature sources to review, which was done after the second modified Delphi round. Following the third modified Delphi round, consensus was decided by majority participant vote, resulting in the main outcome of the study. This design took into consideration how to manage a modified Delphi study in complex and low-resource settings.

Figure 1

Study design. Multipart, multimethods study design that includes an expert panel undertaking a modified Delphi process, developmental evaluation and review of select literature identified by the expert panel. SAFHWT, Social Accountability Framework for Health Workforce Training Tool.

Setting

The research team was based at the Northern Ontario School of Medicine University (NOSMU) and Health Sciences North (HSN). NOSMU is an international leader in socially accountable medical education,44 and HSN is an academic health sciences centre prioritising social accountability through its strategic plan.10

Participants

Given the access to local, equity-oriented and content-specific expertise about social accountability in healthcare, the study purposefully recruited, through professional networks, eight expert participants with multidisciplinary backgrounds from Northern Ontario with affiliations to NOSMU and/or HSN, including a patient advocate. Emphasis was placed on identifying, recruiting and selecting an expert panel that reflected a niche of important local contextual knowledge of applying social accountability in both education and service delivery contexts, along with exposure to broader knowledge of social accountability in other global contexts through professional networks. Given these requirements, only a small number of participants could be identified that suited the study’s objective. The expert panel size, although small, was consistent with the criterion of more than seven participants as the minimum group size for reasonable reliability of a Delphi study.45 46 Participant inclusion criteria reflected: (1) location in Northern Ontario or affiliation with NOSMU or HSN; (2) expertise in social accountability or a related field in healthcare service delivery and/or health professional education and (3) commitment to the project in terms of availability, interest and time. Participant characteristics including age, gender, geographical location and professional roles were collected.

Given the need for global expertise but local contextual knowledge, the research team also doubled as participants and insider researchers.47 48 The benefits of having researchers as participants in this specific study design included a shared professional identity, a deep understanding of the content language and a shared experiential base with the expert panel members. This was conceptualised to benefit participant recruitment and enrolment and to yield a greater richness of data. The benefits were also understood as bringing content expert knowledge to a complex and niche subject matter; better rapport and acceptance in leading an expert panel through the modified Delphi process; and commitment to complete a complex study. To mitigate the risks of confirmation and design bias, participant data were anonymised and the researchers were blinded during data analysis; participant researchers engaged in self-check-ins to ensure they respected the confines of the study’s design and methodology; notes of their activities were kept; and feedback from participants about the process and evolving framework of social accountability standards was consistently solicited and reviewed.

Patient and public involvement

One participant was recruited and enrolled as a patient and family advocate from the academic health sciences centre. They were provided an opportunity for input into the study protocol, participated fully in the modified Delphi process, reviewed study results and were offered authorship. Results of this study will be shared with all participants.

Modified Delphi process

Based on known social accountability tools and frameworks in health professional education,18 23 28–31 35 49 THEnet’s SAFHWT (https://thenetcommunity.org/the-framework/) was identified by the research team as the only comprehensive and validated tool with a framework of standards that could inform this study’s modified Delphi process. The SAFHWT standards served as the starting point to inform participants of the concepts, themes and expectations when exploring the development of a framework of social accountability standards for the different, but linked, context of healthcare service delivery.24 30 33 The SAFHWT was developed in 2013 through a transnational collaboration of 27 experts, using a multistep research methodology similar to a modified Delphi process. It serves as a key tool for health professional training throughout THEnet’s global network.30 33 The SAFHWT is organised into four major sections that capture broad and critical concepts; 21 key component reflective questions that address key themes; 55 aspirations that describe objective standards; and 163 indicators linked to the specific, expected actions of a socially accountable health professional academic institution.

Similar to other studies that have developed social accountability standards,23 50 a modified Delphi process was chosen for this study. Modifications included a smaller, more focused expert panel and the integration of DE data. The modified Delphi process was led by AA, who oriented the expert panel to the SAFHWT and the modified Delphi process via written instructions and one-on-one sessions. Three modified Delphi rounds were conducted by email between July 2019 and December 2021, with a 5-month study interruption due to COVID-19. Members of the expert panel completed their work independently for each Delphi round and were instructed to vote in the following way: (A) accept the standard without modification, (B) accept the standard with modification (suggest modifications), (C) reject the standard or (D) suggest new standards. Their votes and qualitative responses (ie, modifications to existing or newly suggested standards) were submitted on standardised forms, using Microsoft Word and Excel tools, to a research assistant (RA), who anonymised the data. Data were synthesised and the framework of standards was updated based on the expert panel members’ votes by retaining, removing, modifying or adding content to the framework of standards. If there were any incongruencies across the votes, a simple majority was used. Feedback was provided for each round based on results from the modified Delphi process and integration of DE data: round 1—relevance, experts were asked to revise standards to reflect a healthcare service delivery setting; round 2—clarity, experts were asked to revise standards to improve succinctness and reduce redundancy; and round 3—utility, experts were asked to revise standards to improve flow and format.
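As a minimal illustration of the vote-synthesis rule described above, the sketch below tallies hypothetical panel votes for a single standard and resolves any incongruency by simple majority. The vote codes mirror the instructions given to the panel, but the data structures and function names are illustrative assumptions; the study itself managed this step with standardised Microsoft Word and Excel forms.

    from collections import Counter

    # Vote codes mirror the panel instructions: A = accept unchanged,
    # B = accept with modification, C = reject. D (suggest new standards)
    # adds content rather than resolving an existing standard, so it is
    # assumed to be tracked separately.
    VOTE_CODES = {"A", "B", "C"}
    OUTCOMES = {"A": "retain", "B": "modify", "C": "remove"}


    def resolve_standard(votes):
        """Resolve one standard by simple majority of submitted votes.

        `votes` is a list of anonymised vote codes, e.g. ["A", "B", "A"].
        Returns "retain", "modify", "remove" or "missing feedback".
        """
        tally = Counter(v for v in votes if v in VOTE_CODES)
        if not tally:
            return "missing feedback"
        top_code, _ = tally.most_common(1)[0]  # ties would need adjudication (assumption)
        return OUTCOMES[top_code]


    # Hypothetical votes for one indicator-level standard.
    print(resolve_standard(["A", "B", "A", "A", "C", "B"]))  # -> "retain"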

For each round, the total number of standards that were (1) accepted, (2) newly proposed and/or modified, (3) rejected and (4) missing feedback, as well as the total number of standards by major sections, key components, aspirations and indicators, were tracked across the expert panel. The numbers of edited and newly proposed standards were grouped together for practicality. It was predetermined that observing an increasing trend in the total number of standards that were accepted, and a declining trend in the total numbers of standards that were modified or newly proposed, rejected or missing feedback, would trigger consideration of an expert panel consensus vote to approve the final framework of standards.
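As a rough sketch of this predetermined trigger, the following code checks whether per-round tallies show the required trends (accepted counts rising; modified or newly proposed, rejected and missing-feedback counts falling) before flagging that a consensus vote should be considered. The round counts and field names are hypothetical assumptions, not the study’s data.

    def consensus_vote_triggered(rounds):
        """Return True when accepted counts increase and the other tracked
        categories decrease across successive modified Delphi rounds.

        `rounds` is a chronological list of dicts of per-round counts.
        """
        pairs = list(zip(rounds, rounds[1:]))
        accepted_up = all(a["accepted"] < b["accepted"] for a, b in pairs)
        others_down = all(
            a[key] > b[key]
            for key in ("modified_or_new", "rejected", "missing_feedback")
            for a, b in pairs
        )
        return accepted_up and others_down


    # Hypothetical per-round tallies (illustrative only).
    rounds = [
        {"accepted": 90, "modified_or_new": 120, "rejected": 15, "missing_feedback": 20},
        {"accepted": 150, "modified_or_new": 70, "rejected": 8, "missing_feedback": 12},
        {"accepted": 210, "modified_or_new": 30, "rejected": 3, "missing_feedback": 4},
    ]
    print(consensus_vote_triggered(rounds))  # -> True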

Developmental evaluation

The DE51–53 was led by EC and interwoven into each step of the study’s design, including using DE data to further inform the modified Delphi process and sourcing feedback on the study’s processes and content from the expert panel. The main objectives of DE were to explore process and content facilitators and barriers and to help inform ongoing refinements of the standards.

DE data consisted of reflective notes made by the expert panel members, research team communications (team meetings, agendas and notes), and ‘reflective check-ins’ after each modified Delphi round (emails, interviews). In these check-ins, panel members were asked to reflect on what went well, what to change and what’s next. Additional questions helped panel members consider the contextual factors influencing the standards and tool development (box 1). Data were captured through detailed research notes by the DE lead, transcribed into Microsoft Excel and underwent a content analysis by the DE lead and an RA.

Box 1

Developmental evaluation reflective questions

What do we need to pay attention to? What do we need to learn?

  1. What opportunity are we trying to address?

  2. What are the key drivers around this initiative?

  3. What resources do we have to work with?

  4. What are the leverage points?

  5. What are the potential challenges, gaps and road blocks?

  6. Who are the key stakeholders and what are their roles?

  7. What expectations, interests and assumptions do these stakeholders have and what is their level of interest/influence?

  8. What is their tolerance for risk and failure?

  9. How does the group (the expert panel) make decisions?

  10. What are the power dynamics among the group?

  11. What are the strengths and weaknesses of the group?

Preliminary DE data were used to focus the modified Delphi process after each round, and to adjust the study’s processes and content in real time. At the completion of the study, DE data were summarised through content analysis,54 using a process of open coding (to identify categories and subcategories) and axial coding (to derive meaning across and between the categories). The DE lead and RA coded the material independently and then met to finalise the results.

Literature identified to consult by expert panel

A key theme that emerged from DE was that the expert panel referred to literature pointing to gaps in the framework of standards. After the second modified Delphi round, the literature identified by the expert panel was reviewed by the research team, led by a final-year medical student (JH). Sources were included based on agreed criteria (language: English; years: 1995–2020; location: all geographies considered; format: all formats considered) and based on relevance and applicability driven by three major questions: (1) Is this article used to guide health professional education and/or healthcare service delivery? (2) Does the article represent a milestone in the evolution of social accountability? and (3) Can the article apply to the context of healthcare service delivery?

Core concepts were extracted and mapped to this study’s social accountability standards. Gaps were identified and described. Redundant gaps and those that did not apply to developing a framework of standards were excluded. Remaining gaps were grouped and described as new concepts, which were integrated into the third modified Delphi round’s framework of standards by the Delphi lead. The total number and a description of amendments were recorded.

Final consensus vote

Trending Delphi data and the expert panel’s capacity for additional rounds triggered a vote on the final iteration of standards. Expert panel members were emailed a survey asking them to indicate whether they agreed or did not agree that the standards in each of the evaluation tool’s four major sections were ready for pilot testing, and participants’ comments were invited. Consensus was defined as a simple majority.

Statistical analysis

Descriptive statistics were used to present summarised results of participant data, each modified Delphi round, the review of literature identified by the expert panel and the final consensus vote.

Results

Study participants

Participation rates and participant characteristics are listed in table 1. Participation rates declined with each modified Delphi round, which is common in Delphi studies but also reflected the significant impact of the COVID-19 pandemic.

Table 1

Participant characteristics and participation rates

Modified Delphi process

Over the three modified Delphi rounds, the number of standards accepted without changes trended up; the numbers of new or edited standards, rejected standards and standards with missing feedback trended down. In the third iteration, the total number of standards remained consistent with the SAFHWT (online supplemental figures 1 and 2).

DE: language, context and engagement matter

At the completion of the study, content analysis of DE data identified three main themes across all rounds of the modified Delphi process. All expert panel members identified the need to focus on language that defines important differences between health professional education and healthcare service delivery, and the need for a framework of standards that uses relevant terminology. ‘Context matters’ was also a core theme, identified both in the study’s setting and in the need for a framework of standards that allows for contextual differences. Expert panel members brought the context of living through the COVID-19 pandemic to the development of the framework and expressed a lack of time to commit to the modified Delphi process. The DE provided them with a way to express the contextual factors influencing all aspects of their lives. Lastly, engagement was identified as a theme, referring both to being engaged or disengaged through the modified Delphi process and to the need for continued engagement with key leaders, organisations and knowledge users to ensure the future, successful implementation of this framework of standards.

DE also identified specific modifications, which informed alterations to the study’s processes and content in real time as the study progressed. These included: the need for a glossary of terms; the need to consult additional literature identified by the expert panel (ie, other tools); alterations to the timeline; gaps in the proposed framework of standards; and measures to address barriers while leveraging key drivers and facilitators towards the study’s primary outcome. As the study progressed, this real-time feedback helped the research team modify its processes to better meet the needs of the participants and to achieve the study’s objectives, and resulted in more robust outcomes, namely the proposed framework of social accountability standards for healthcare service delivery and process factors beyond the standards that can inform implementation.

Literature identified to consult by expert panel: gaps and concepts

Sixteen articles were selected for review and comparison with the developed framework of standards. Summary findings are outlined in table 2, online supplemental figure 3 and online supplemental table 1. Twenty-five gaps were identified and described. After excluding redundant gaps and those that did not apply to developing social accountability standards, 10 new concepts were integrated during the third modified Delphi round with expert panel member feedback and contributed 43 amendments to the framework of standards. All the core concepts identified in the review of literature were mapped to the proposed framework of social accountability standards.

Table 2

Summary of new concepts and description of amendments made based on the review of literature identified to consult by the expert panel

Final consensus vote

Six out of seven (86%) expert panel members voted. All voting expert panel members (100%) approved the final framework of social accountability standards for pilot testing. No participant comments were received.

Main outcome: framework of social accountability standards

The main outcome of this study demonstrated that a framework of social accountability standards can be developed for healthcare service delivery institutions (ie, hospitals). This study led to the development of a framework of standards that is organised into 4 major sections, which are divided into 17 key components, 39 aspirations and 197 indicators. Major sections represent key social accountability concepts. Key components are reflective questions that subdivide the major sections into core themes. Aspirations are linked to key components and represent focused objectives, which are supported by indicators that outline specific expectations. For select examples from the framework of standards, please see online supplemental table 2.

Creating a continuum: linking academic and service delivery institutions

The study also identified that social accountability can be shared between the different, but linked, environments of education and healthcare service delivery, while preserving established social accountability concepts, themes and expectations that exist for health professional education programmes (ie, medical schools). In comparing the developed framework of standards to existing standards in health professional education, the preservation of concepts, themes and expectations was evident in both structure and content. Standards that reflected the shared operational goals between these two environments required only minor alterations, such as those relating to research, education and service delivery; identifying communities served; and intentions to address priority needs. In contrast, the most divergent areas were clustered around indicators that reflected the operational differences between these two environments and required restructuring or new standards. For example, some standards for health professional education programmes identify training a future health workforce through a comprehensive admission process and curriculum. For healthcare service delivery institutions, this concept resonated more with human resource planning and maintaining a health workforce that can deliver the necessary healthcare services. Lastly, as the study evolved, standards became more aligned with the key operations of a healthcare service delivery institution.

The preservation of structure and content, along with the shared and divergent operational goals of these two environments, underlines the possibility of a social accountability continuum that includes health professional education programmes, their graduates and institutions responsible for the delivery of healthcare services. These institutions, working together with shared and linked equity-oriented social accountability strategies, could act as catalysts for a more robust mechanism to transform healthcare systems towards equity-oriented outcomes that focus on local population needs (figure 2).

Figure 2

Social accountability continuum. Illustration of how social accountability as a health policy strategy could transform healthcare systems when embedded in the different, but linked, environments of health professional education and healthcare service delivery.

Discussion

Principal findings

Two notable findings were discovered while exploring the development of social accountability standards for healthcare service delivery. The first and primary finding is that social accountability can be a shared obligation between the different, but linked, environments of health professional education and healthcare service delivery, setting the stage for transformation of healthcare systems towards social accountability and equity-oriented outcomes that focus on local population needs. Second, an educational framework for social accountability (ie, the SAFHWT) can be adapted and can serve as a determinate point of reference for social accountability frameworks in healthcare service delivery.

Limitations

There were two notable limitations in the study. First, the expert panel for the modified Delphi process can be perceived as both a limitation and a strength. Given the highly focused, niche area of study, the inclusion of immersed, motivated and content-expert ‘insider researchers’ with well-situated subject knowledge was necessary to organise the methods and results from a valid social accountability determinate point of reference. This raises the possibility of confirmation and design bias; however, steps were taken in the study’s methods to mitigate these risks. Similarly, participants were purposefully recruited to prioritise experts with the diversity, experiential depth and unique knowledge of social accountability in health professional education and healthcare service delivery from two institutions grounding themselves in the concept. Prioritising a small, local expert panel carries the risk of selection and respondent bias, but also results in a panel with the characteristics to achieve the study’s objectives. Second, the COVID-19 pandemic affected the study in many ways, specifically with regard to engagement in the Delphi process. Despite this, almost all participants reviewed the final iteration of standards and participated in the final vote. Although the focus of this study was to explore how social accountability in health professional training programmes could inform service delivery settings, it may be worth considering other existing, alternative frameworks in future research.

Strengths

The study’s design allowed expert opinions to be captured, compared and validated in a timely, responsive and controlled manner through a structured modified Delphi process. Additionally, real-time feedback captured through DE not only helped inform the modified Delphi process but also allowed for real-time adaptations of the study’s processes and content, ensuring the study’s success. All standards were mapped to core concepts from the additional literature the expert panel identified to consult through DE, which helped to resolve a relatively small number of gaps. The critical concepts, themes, objectives and expectations for social accountability in healthcare service delivery remained consistent with those for health professional education found in the SAFHWT.

Comparison with other studies

There are no studies that the authors are aware of that have developed a comprehensive, evidence-based framework of social accountability standards for healthcare service delivery institutions (ie, hospitals) that extends from existing social accountability concepts, themes and expectations for academic institutions (ie, medical schools). This study demonstrated that the core idea of social accountability is applicable and relevant to a healthcare service delivery setting, supporting the moral and ethical arguments and growing calls to action. Noting that social accountability is a shared obligation between the different, but linked, environments of health professional education and healthcare service delivery adds to the established knowledge base of social accountability strategies in health professional education and provides a practical means to implement a shared equity-oriented health policy strategy that focuses on local population health outcomes.

In light of this, there are important definitional changes needed, for example, to the frequently referenced WHO definition of the social accountability of medical schools, to better account for the contextual differences within healthcare service delivery settings. This study proposes the following definition for social accountability in healthcare service delivery:

… the obligation for health institutions to direct their health care services, research and education towards addressing the priority health needs, social needs and health inequities of the patients, populations, communities and region they are mandated to serve, and that this obligation extends to the needs of those who are marginalized, underserved and who experience inequity. Priority needs must be identified in partnership with key stakeholders, through meaningful community engagement and be guided by the values of relevance, quality, cost-effectiveness and equity.55

Conclusion

Accounting for the limitations and strengths of the study’s design, the proposed framework of social accountability standards for healthcare service delivery is comprehensive, evidence-based and in step with the social accountability of health professional education programmes. This framework of standards will be included in the SAFE for Health Institutions Project’s toolkit, which is intended as a practical means to help healthcare service delivery institutions, health administrators, policy-makers and clinicians implement social accountability as an equity-oriented health policy strategy that focuses on local population needs. These standards will next be piloted and further explored in the setting of an academic health sciences centre’s emergency department.

Data availability statement

Data are available on reasonable request. Portions of the study’s data can be made available for review on request. For all inquiries for data access, please email aanawati@nosm.ca or ercameron@nosm.ca.

Ethics statements

Patient consent for publication

Ethics approval

This study involves human participants and was approved by Health Sciences North Research Ethics Review Board, Project #019-016. Participants gave informed consent to participate in the study before taking part.

Acknowledgments

The authors acknowledge Northern Ontario School of Medicine University, the Northern Ontario Academic Medicine Association, Health Sciences North and the Health Sciences North Research Institute, and thank Hafsa Bohonis for her assistance with the project.



Footnotes

  • Twitter @alexanawati

  • Contributors AA acted as the principal investigator (PI) and co-lead for all aspects of this study. The SAFE for Health Institutions project was conceived by AA who was responsible for all administrative aspects of the study; established all protocols; secured a grant through the Northern Ontario Academic Medicine Association (NOAMA); submitted the ethics application to Health Sciences North (HSNRI) Research Ethics Board (REB); recruited and oversaw research participants; oversaw the hiring and work of a research assistant (RA); participated in the modified Delphi process; oversaw the collection of participant data and feedback from the modified Delphi process; worked closely with participant data to aggregate all Delphi data and feedback into subsequent iterations of the SAFE for Health Institutions Evaluation Tool’s social accountability standards; reviewed and analysed the data from the modified Delphi process; provided oversight of a summer research student who undertook the review of literature identified by the expert panel; reviewed and analysed the data from the review of literature identified by the expert panel; participated in developmental evaluation; reviewed the developmental evaluation data; drafted the current manuscript; reviewed feedback from other authors and provided revisions; has approved the final version of the manuscript; agrees to act as a guarantor to the work; and is the corresponding author. EC acted as the main Co-Investigator (Co-I) and co-lead for all aspects of this study. EC was the main Co-I who substantially helped to further develop the idea for the SAFE for Health Institutions Project and its protocols; reviewed and revised the NOAMA grant application; reviewed and revised the ethics application; assisted in the recruitment and oversight of research participants and the work of the RA; participated in the Delphi process; assisted with the collection of participant data and feedback from the modified Delphi process; reviewed and analysed the data from the modified Delphi process; reviewed and analysed the data from the cross-section review of literature; acted as the lead for developmental evaluation methods, data collection and analysis; assisted in drafting the current manuscript; reviewed feedback from other authors and provided revisions; has approved the final version of the manuscript; and agrees to act as a guarantor to the work. JH acted as a research team member and participated in all aspects of this study. JH reviewed and provided feedback for the study’s protocols; reviewed the NOAMA grant application; reviewed the ethics application; participated in the Delphi process; reviewed and analysed the data from the modified Delphi process; led the review of literature identified by the expert panel; was a recipient of NOSMU’s Summer Studentship Grant; provided a summary of findings for the review of literature identified by the expert panel; participated in developmental evaluation; assisted in drafting the current manuscript; reviewed feedback from other authors and provided revisions; has approved the final version of the manuscript; and agrees to act as a guarantor to the work.

  • Funding This work was supported by CAN$50 000 in funding from the Northern Ontario Academic Medicine Association (NOAMA), CAN$7800 in funding from the Sudbury Emergency Associates Local Education Group and CAN$6600 in funding from the Northern Ontario School of Medicine University's Summer Studentship Programme.

  • Competing interests None declared.

  • Patient and public involvement Patients and/or the public were involved in the design, or conduct, or reporting, or dissemination plans of this research. Refer to the Methods section for further details.

  • Provenance and peer review Not commissioned; externally peer reviewed.

  • Supplemental material This content has been supplied by the author(s). It has not been vetted by BMJ Publishing Group Limited (BMJ) and may not have been peer-reviewed. Any opinions or recommendations discussed are solely those of the author(s) and are not endorsed by BMJ. BMJ disclaims all liability and responsibility arising from any reliance placed on the content. Where the content includes any translated material, BMJ does not warrant the accuracy and reliability of the translations (including but not limited to local regulations, clinical guidelines, terminology, drug names and drug dosages), and is not responsible for any error and/or omissions arising from translation and adaptation or otherwise.