Article Text
Abstract
Introduction Implementation researchers could draw from participatory research to engage patients (consumers of healthcare) in implementation processes and possibly reduce healthcare disparities. There is currently little consumer involvement in healthcare implementation, partly because no formal guidance exists. We will create and pilot a toolkit of methods to engage consumers from the US Veterans Health Administration (VHA) in selecting and tailoring implementation strategies. This toolkit, Consumer Voice, will provide guidance on what, when, where, how and why an implementer might engage consumers in implementing treatments. We will pilot the toolkit by implementing the Safety Planning Intervention for suicide prevention with rural veterans, a population with suicide disparities. The Safety Planning Intervention is effective for reducing suicidal behaviours.
Methods and analysis In Aim 1, we will use participatory approaches and user-centred design to develop Consumer Voice and its methods. In Aim 2, we will pilot Consumer Voice by implementing the Safety Planning Intervention in two clinics serving rural VHA patients. One site will receive a current implementation strategy (Implementation Facilitation) only; the second will receive Implementation Facilitation plus Consumer Voice. We will use mixed methods to assess feasibility and acceptability of Consumer Voice. We will compare sites on preliminary implementation (reach, adoption, fidelity) and clinical outcomes (depression severity, suicidal ideation, suicidal behaviour). In Aim 3, we will evaluate Aim 2 outcomes at 20 months to assess sustained impact. We will gather qualitative data on sustainability of the Safety Planning Intervention.
Ethics and dissemination These studies are overseen by the Institutional Review Board at the Central Arkansas Veterans Healthcare System. We plan to use traditional academic modalities of dissemination (eg, conferences, publications). We plan to disseminate findings through meetings with other trainers in implementation practice so they may adopt Consumer Voice. We plan to share results with local community boards.
- mental health
- public health
- quality in health care
This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/.
Strengths and limitations of this study
Rigorous, iterative process based on user-centred design.
Includes consumers/patients in several steps of toolkit development.
Consumers/patients are involved on the research team that makes decisions about Consumer Voice.
Researchers will have difficulty detecting patient-level differences in outcomes because of the relative infrequency of suicide.
Provides a toolkit on how to engage consumers/patients in implementation practice that can generalise to other settings.
Introduction
Healthcare disparities are significant differences in receipt of, access to, quality of, or outcomes of healthcare between marginalised groups and reference groups.1 Healthcare disparities persist in the USA and in the Veterans Health Administration (VHA) for several marginalised groups who have experienced societal oppression.2 3 One reason disparities persist is that clinical interventions target patient factors only—patients’ individual attitudes, behaviours—and although these are necessary targets to reduce disparities, they are not sufficient.4 We must also intervene on broader structures—for example, cultures of change, policies, organisational climate. Implementation scientists can address these broad, organisational factors contributing to disparities by using implementation strategies.
Implementation strategies are implementation interventions to address known barriers to uptake of a clinical intervention.5 Implementation strategies are commonly targeted at providers, clinics, hospitals or systems, such as provider training, performance data feedback or securing new funding streams.6 For example, to reduce racial disparities in guideline-concordant cardiovascular disease care, one possible implementation strategy is to plan for, act on and re-evaluate quality improvement efforts among patients by race. Typically, implementers—researchers, quality improvement personnel, facilitators—select and tailor implementation strategies. Tailoring a strategy involves refinements or tweaks so that it fits better with local context and more precisely targets implementation barriers.7 Although existing implementation strategies have improved care for the general population,5 they may not be sufficient to improve care for marginalised populations.8 One potential solution to reduce healthcare disparities is to engage marginalised patients (referred to as consumers) in selecting and tailoring implementation strategies to better fit their needs.
Participatory approach to engage consumers in implementation
Participatory research is an approach in which consumers are actively engaged in the research process. Consumers might be informants, discussants, or partners in research with varying degrees of decision-making power and trust with healthcare or academic staff. Among marginalised populations, participatory research has enhanced retention in health disparities research,9 improved fidelity to clinical care,10 improved health outcomes,9 and reduced inequities in access to, satisfaction with, and quality of care.11 In fact, the Agency for Healthcare Research and Quality recommends participatory research as a ‘gold standard’ to reduce disparities.12
Although implementers often engage healthcare staff, using participatory approaches to involve consumers throughout implementation does not often occur.13 In the most robust example of using a participatory approach to enhance implementation, quality improvement that included community members was more effective than technical assistance without a participatory approach for uptake of a depression intervention across diverse US healthcare settings.14
A participatory approach to implementation shares principles with participatory research, such as work funded by Patient-Centered Outcomes Research Institute in which consumers inform research outcomes most important to them. Yet, engaging consumers in implementation is distinguishable from participatory research by its focus beyond intervention and outcomes to broader factors necessary to get patients, organisations and providers to be willing or able to implement the intervention. The benefit of this study is that we will engage consumers to focus on strategies to increase uptake of an intervention rather than more typical consumer engagement to determine components of an intervention or outcomes.
Gap in implementation and purpose of the current study
Implementation scientists need to reduce disparities in uptake and reach of interventions.15 Although consumer engagement in implementation has nascent evidence of improving healthcare among marginalised populations,16 methods for involving consumers in selecting and tailoring implementation strategies are not well synthesised or documented. Thus, participatory approaches to implementation are used less frequently than ideal, not well operationalised or reported, and not well studied as potential mechanisms for decreasing healthcare disparities. A Cochrane review of consumer engagement in healthcare called for greater specificity on how consumers are engaged and what resources are needed so processes and positive effects can be replicated.17
The purpose of this study is to augment a conventional method of selecting and tailoring implementation strategies (Implementation Facilitation) with consumer engagement and assess feasibility, acceptability and preliminary impact of consumer engagement on implementation and clinical outcomes. We will systematically develop a toolkit, Consumer Voice, to guide processes for engaging consumers in selecting and tailoring implementation strategies. We will pilot it by implementing the Safety Planning Intervention to prevent suicide among rural VHA patients.
Conventional strategy: Implementation Facilitation
To evaluate Consumer Voice, we will pair it with a conventional implementation strategy, Implementation Facilitation.18–20 Implementation Facilitation is defined as ‘a process of interactive problem solving and support that occurs in the context of a recognized need for improvement and a supportive interpersonal relationship’.21 22 Implementation Facilitation involves methods to select and tailor implementation strategies and does not usually involve interfacing with consumers.
Intervention to be implemented: Safety Planning Intervention among rural VHA patients
Approximately six VHA patients die by suicide daily.23 Compared with urban-dwelling US veterans, rural-dwelling US veterans are more likely to consider suicide and less likely to access mental healthcare.24 Safety Planning Intervention is a suicide prevention intervention, effective at reducing suicidal ideation and suicidal behaviours, inside and outside VHA.25 26 Safety Planning Intervention is a one-session, clinical intervention for patients with suicidal thoughts or behaviours. Patients and providers collaboratively create a safety plan with prompts populated in the VHA electronic health record and a copy given to patients.25 A complete Safety Planning Intervention safety plan consists of six types of coping skills that patients can use when suicidal thoughts arise.
Specific aims
Using a participatory approach, we will develop a toolkit (Consumer Voice) containing methods to engage consumers in selecting and tailoring implementation strategies.
Using a two-arm design, we will pilot feasibility and acceptability of Consumer Voice and its preliminary impact on implementation and clinical outcomes by implementing Safety Planning Intervention.
We will compare Implementation Facilitation to Implementation Facilitation plus Consumer Voice on sustainability of Safety Planning Intervention and assess factors that enhance or hinder sustainability of Safety Planning Intervention.
Methods and analysis
Patient and public involvement
The development of this research question was informed by patient and public opinion through our VHA centre’s Veterans Research Council. The lead author met with them as a group, presented research ideas, integrated some of their feedback while maintaining decision-making power, and returned to the council a second time to refine ideas before submitting this research for external funding. We also incorporated patient and public involvement in the design of Aim 1, especially recruitment strategies and locales suited for patients, by consulting with three patient representatives working in community organisations serving VHA patients in our US state.
Although this is a protocol, we began early components of the study and added two community member consultants (Veterans) to our research team, which makes decisions about the form and function of Consumer Voice. For dissemination, we plan to share results with our local community boards, such as the community service organisations serving VHA patients and our local Veterans Research Council. We also plan to create an infographic of key results and distribute it on social media through our research centre.
Theoretical approach
We will use the Health Equity Implementation Framework27 (see figure 1) to inform this research. This framework posits domains that predict successful implementation and reductions in implementation disparities. Within each domain are several determinants or specific factors that are measurable and, together in constellation with other determinants, clarify barriers, facilitators, moderators or mediators to equitable and successful implementation. The framework also proposes a process—Implementation Facilitation—by which change in each domain would occur.28 29
Some examples of domains in the Health Equity Implementation Framework are described below. Innovation refers to the treatment, intervention, practice, or new ‘thing’ to be implemented (ie, the Safety Planning Intervention), adopted by providers and staff, and delivered to patients.30 Recipients are individuals who influence implementation and those who are affected by its outcomes (ie, rural VHA patients, VHA staff and providers), at the individual and collective team levels.29 Cultural factors of recipients are characteristics unique to a particular group in the implementation effort (eg, patients, staff, providers) based on their lived experience. Some examples are implicit bias, socioeconomic status, stress related to discrimination, health literacy, health beliefs, or trust in the healthcare staff or patient group.31 32 Economies include how innovations are marketed and acquired (ie, government-controlled healthcare at low cost) and other market forces that change demand for the Safety Planning Intervention (eg, it becomes offered at local urgent care clinics outside of VHA). Physical structures are the places people must visit to receive healthcare and the environmental elements people may be exposed to that exacerbate or minimise the health problem.33 One factor in rural areas can be lack of confidentiality for suicide screening in a small town with few providers where many residents know each other.
We will use the Health Equity Implementation Framework to: (1) identify barriers/facilitators to using Consumer Voice (Aim 1), (2) identify barriers/facilitators for Safety Planning Intervention implementation among rural VHA patients that will guide Implementation Facilitation and Consumer Voice at local clinics (Aim 2) and its sustainability (Aim 3), and (3) interpret results from Aims 1, 2 and 3.
Setting
To reach a subset of rural VHA patients at risk for suicide, we will target rural VHA community-based outpatient clinics in Arkansas that house primary care and mental healthcare. One reason to implement suicide prevention in these primary care settings is that many veteran suicide deaths occur among those not engaged in mental healthcare who do seek primary care.23 Suicide prevention in primary care will reach more high-risk, rural veterans than in mental healthcare alone.
Study design, processes and planned analyses by specific aim
Aim 1: using a participatory approach, develop a toolkit (Consumer Voice) containing methods to engage consumers in selecting and tailoring implementation strategies
We will build a toolkit for use in engaging consumers in selecting and tailoring implementation strategies. Consumer Voice will be a multimedia manual showcasing who, what, when, where, how and why implementers should engage veterans (as consumers of healthcare) in implementing new or improved healthcare services. Our team will build the first draft of Consumer Voice based on a complete environmental scan of existing examples of consumer engagement in implementation activities.34
User-centred design
We will build Consumer Voice to expand Implementation Facilitation by using a QUALITATIVE→quantitative→QUALITATIVE structure through three sequential steps in which qualitative data will be given more weight (figure 2).35 Drawing from user-centred design,36 we will use an iterative approach to engage end-users (implementers) and other stakeholders in initial prototype testing and then mini-pilot tests of Consumer Voice. Our three sequential steps are: (1) conduct individual qualitative interviews and cocreation sessions with diverse stakeholders, (2) ask implementers to pilot Consumer Voice briefly in their own work and reconvene through a Delphi process to achieve consensus on components, and (3) reconvene diverse stakeholders again in a nominal group technique process to clarify the most feasible and important components for the final prototype of Consumer Voice. Within each step, we will use a variety of user-centred design methods such as interviews about user perspectives, applying process maps to visualise system-level implementation activities needed for Consumer Voice, cocreation sessions in which stakeholders develop some aspects of Consumer Voice alongside our team, and experience sampling (ie, implementers briefly pilot using Consumer Voice in their work).36
Step 1: stakeholder qualitative interviews
We will conduct interviews with key stakeholders (see table 1) to refine operational definitions of consumer engagement in implementation methods, preferences or needs, potential barriers to and facilitators of using these methods, and technical resources needed for Consumer Voice. We expect to achieve saturation between 12 and 20 total interviews.37
To recruit stakeholders, we will use a respondent-driven, non-probabilistic approach, reaching out to existing contacts in each stakeholder group for potential participation. These contacts will serve as referral agents who suggest other stakeholders in any group for recruitment. We have built connections and partnerships with two veteran community groups. Stakeholders will be offered financial payment as an incentive.
Interview guide
The interview guide will be structured to assess preferred types of engagement and technical resources; see sample questions in table 2. Interviews will be audio recorded, approximately 45 min long, and interviewers will take notes during the interview.
Qualitative analysis
We will use a Rapid Assessment Process to analyse qualitative data from stakeholder interviews. The time required for this approach can range from 4 days to 6 weeks.38 This method is useful for an implementation study in which there is a time-sensitive demand for creation and modification of an implementation product (Consumer Voice), yet a need for rigour in the analysis.39 The analysis will blend inductive and deductive approaches, using directed content analysis,40 allowing a framework to guide analysis deductively while leaving room for emerging information. We will use the Health Equity Implementation Framework to create summary templates to categorise barriers and facilitators. We will present results to veteran community groups focused on suicide prevention to elicit feedback that will inform the next iteration of Consumer Voice.
Step 2: Delphi process with implementation experts
We will ask implementers to use a beta version of Consumer Voice in their own work as an uncontrolled pilot. Then, using a modified Delphi process that will produce quantitative data from voting, we will generate consensus on Consumer Voice through rounds of discussion and voting with those implementers.41
We will use respondent-driven sampling to identify up to 12 implementation experts by advertising on Twitter and approaching professional implementation networks. We will ask each of these participants to reach out to one other potential participant through e-mail or social media.
Experts will be engaged in two to three 60 min virtual Delphi sessions, using online polling and discussion to reach consensus through videoconferencing platforms, Microsoft PowerPoint and telephone calls. The sessions will follow this cycle: (1) present the draft version of Consumer Voice, elicit discussion from participants based on their experience, and vote on which components to include (70% agreement achieves consensus)42; and (2) present group results back to participants and elicit discussion before voting again on which components to include. Implementers will receive the final version of Consumer Voice and monetary payment as an incentive.
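As a purely illustrative sketch (in Python, with hypothetical components and votes rather than study data), the 70% agreement rule for a Delphi round could be tallied as follows:

```python
# Illustrative only: hypothetical components and ballots, not study data.
from collections import Counter

CONSENSUS_THRESHOLD = 0.70  # 70% agreement achieves consensus

# Each expert votes True (include) or False (exclude) for each draft component.
votes = {
    "cocreation sessions": [True, True, True, True, True, True, False, True],
    "consumer advisory panel": [True, True, False, True, False, True, True, False],
}

for component, ballots in votes.items():
    agreement = Counter(ballots)[True] / len(ballots)
    status = "consensus reached" if agreement >= CONSENSUS_THRESHOLD else "revisit next round"
    print(f"{component}: {agreement:.0%} agreement -> {status}")
```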
We will also administer a one-time set of three questionnaires produced by Weiner et al,43 four questions each, assessing feasibility, acceptability and appropriateness of the beta version of Consumer Voice. Responses are on a Likert scale ranging from 1 (completely disagree) to 5 (completely agree). Example items include ‘Consumer Voice is appealing to me’ (acceptability), ‘Consumer Voice seems fitting’ (appropriateness) and ‘Consumer Voice seems doable’ (feasibility).
Step 3: nominal group technique to finalise Consumer Voice
Finally, we will use the nominal group technique with stakeholders to prioritise final components of Consumer Voice after step 2 (post-Delphi version) based on stakeholder rankings of importance and feasibility. The nominal group technique is a participatory research method in which exploratory questions about a topic are presented to small stakeholder groups to generate ideas, develop consensus and set priorities for guidelines, particularly for research areas that are underdeveloped.44
We will host 2–3 2-hour meetings with subsets from the diverse stakeholder groups in table 1. Each stakeholder will attend only one meeting. We will offer very small groups (eg, 2–4 individuals), varying locations that can be private and confidential, and even individual interviews should a stakeholder prefer not to discuss these topics with others. Our sampling is consistent with recommendations for the nominal group technique: emphasis is on involving people from different roles/locations to ensure heterogeneity of viewpoints.45
Participants will be provided an explanation of nominal group technique, key terms used in discussion, and a draft of the Consumer Voice toolkit. Participants will also be provided with preprinted forms that specify exploratory response questions. The exploratory questions will be honed through initial individual stakeholder interviews; they will likely resemble: (1) ‘What do you think are the most important and feasible ways to engage VHA consumers in implementing a healthcare intervention?’ and (2) ‘What are other methods or ways to engage VHA consumers in implementation?’. Participants will be able to select, adapt and suggest new methods in their lists. Participants will be provided 15 min to brainstorm in silence followed by an oral round of listing ideas on flipcharts, serial discussion of each idea, group ranking of priorities, group discussion of rankings and re-ranking until consensus is reached.
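As a simple illustration (hypothetical engagement methods and rankings, not study data), group priorities from the nominal group technique could be aggregated by summing ranks across participants before the re-ranking discussion:

```python
# Illustrative only: hypothetical engagement methods and participant rankings.
from collections import defaultdict

# Each participant ranks the candidate methods (1 = highest priority).
rankings = [
    {"town hall forums": 1, "veteran advisory board": 2, "peer feedback sessions": 3},
    {"town hall forums": 2, "veteran advisory board": 1, "peer feedback sessions": 3},
    {"town hall forums": 3, "veteran advisory board": 1, "peer feedback sessions": 2},
]

totals = defaultdict(int)
for ranking in rankings:
    for method, rank in ranking.items():
        totals[method] += rank

# Lower summed rank = higher group priority; results seed the re-ranking round.
for method, total in sorted(totals.items(), key=lambda item: item[1]):
    print(f"{method}: summed rank {total}")
```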
Analysis to finalise Consumer Voice
The analytic plan is to use and connect data gathered after each of the three steps to form iterative prototypes of Consumer Voice.35 After each step, an analysis team (authors ENW, IAB, JEK, CW, veteran consultants) will meet to integrate data gathered from the prior step, using brainstorming and consensus, and decide how to integrate changes into the next Consumer Voice prototype. Data may take the form of suggested visual changes, stakeholder needs, suggested methods, activities or archival examples of consumer engagement in implementation. The analysis team will categorise the function of each consumer engagement method/activity on a continuum from least intensive to most intensive (eg, from informing consumers to partnering with them).46 One likely challenge we expect is for findings from stakeholders to diverge. The analysis team will work to resolve discrepancies during mixed-methods analysis between each step.47 In the final joint nominal group technique session, we will present remaining discrepancies to diverse stakeholders and elicit feedback on how to resolve, lending priority to different groups based on the function or form of the discrepancy (eg, clinical expert opinions will be given priority on components of clinical intervention delivery).
Aim 2: using a two-arm design, we will pilot feasibility and acceptability of Consumer Voice and its preliminary impact on implementation and clinical outcomes by implementing Safety Planning Intervention
We will use the Consumer Voice toolkit to conduct engagement meetings, events and interactions with rural veterans who have experienced suicidal thoughts or behaviour, and their families, in selecting and tailoring implementation strategies for Safety Planning Intervention. During and after these interactions, we will conduct a mixed-methods process evaluation of the Consumer Voice toolkit and process. We will conduct a pilot study using an effectiveness-implementation hybrid type 2 design comparing Implementation Facilitation only with Implementation Facilitation plus Consumer Voice on implementation and clinical outcomes.48
Sites
Within our VHA regional system, one community-based outpatient clinic (referred to as ‘clinic’) will be the ‘standard care’ clinic at which Implementation Facilitation alone is used; the second clinic will be the ‘implementation site’ at which Implementation Facilitation plus Consumer Voice is used (table 3). We randomly assigned each site’s implementation condition. The sites are matched on clinic size and percentage of veterans defined as rural. One possible challenge is that sites might drop out of the study. If a site is unable to participate, mental health leaders at our VA facility have identified alternate sites for this study.
Timeline for Safety Planning Intervention implementation and data collection
Implementation will occur in four phases, each lasting 4 months: planning, pre-implementation, implementation and sustainability. Although time periods are short compared with larger trials, they will allow sufficient time to determine feasibility and acceptability of Consumer Voice (table 3).
Implementation strategies across implementation phases
There will be one facilitator who will use conventional Implementation Facilitation at the standard care clinic, and Implementation Facilitation plus Consumer Voice at the implementation clinic. Implementation Facilitation and Consumer Voice will occur on the same timeline, although we anticipate additional or different activities at the implementation clinic, where Consumer Voice is used in conjunction with Implementation Facilitation. The facilitator will track their weekly time and activities related to Implementation Facilitation using pre-established tracking logs49 and key Implementation Facilitation events that occur using a pre-established checklist from our preliminary work.50 The facilitator will use these logs to document whether clinics receive the same amount and type of Implementation Facilitation activities.51
One anticipated challenge is that Consumer Voice participants may drop out over the course of designing for implementation. In preparation for this, we will track retention as an outcome for the process evaluation (see table 4). If dropout occurs, we will apply techniques similar to those used in original recruitment, identify new participants and spend 1–2 hours orienting them to consumer engagement and implementation, the development of Consumer Voice, and study progress to date.
Mixed-methods process evaluation of Consumer Voice
Because implementers will be the end-users of Consumer Voice, using its methods to engage consumers, we need to assess feasibility and acceptability of the Consumer Voice toolkit and methods with implementers as well as consumers. We will conduct a mixed-methods process evaluation of Consumer Voice.52 We will use a qualitative+QUANTITATIVE design; data will be collected simultaneously and greater weight will be given to quantitative measures.35 53
Procedures
The function of these mixed-methods data will be convergence, which involves integrating them to answer the same question: is Consumer Voice feasible and acceptable to all stakeholders?35 To assess feasibility, we will administer brief surveys at consumer engagement events to all consumers and healthcare professionals and use logs for tracking data in real time during these interactions. To assess acceptability, we will use (1) the same surveys and logs used for feasibility data collection to assess retention and physical safety, and (2) brief qualitative interviews with consumers and healthcare professionals to assess burden and satisfaction. We will also attempt to interview consumers who responded to initial recruitment but did not attend, or who dropped out, about their reasons for non-attendance. This strategy allows for greater external validity by ensuring broader variability in the data.54
Measures
We will assess feasibility outcomes suggested by Orsmond and Cohn55 as seen in table 4. Acceptability outcomes were designed based on recommendations from Proctor and colleagues, as seen in table 5.56 We will again administer Weiner’s three questionnaires,43 four questions each, assessing feasibility, acceptability and appropriateness of Consumer Voice, as used in Aim 1.
Analysis
To integrate data, we will merge information from quantitative and qualitative datasets.35 We will use descriptive statistics to analyse quantitative data. Qualitative data from surveys and interviews will be extracted into summary templates aligned with the Health Equity Implementation Framework.27 The coding team will analyse data using a blend of inductive and deductive approaches through the Rapid Assessment Process described in Aim 1.38 39 As one way to triangulate data to answer questions about acceptability and feasibility of Consumer Voice, some qualitative categories can be quantified (eg, 0=not satisfied, 1=somewhat satisfied) and converged with quantitative data. Another way to triangulate data will be for the mixed-methods analytic team to meet together to present, review, discuss and integrate findings from quantitative and qualitative data.
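As a minimal sketch of this convergence step (participant identifiers, codes and scores are hypothetical, not study data), quantified qualitative categories could be merged with survey scores as follows:

```python
# Illustrative only: hypothetical qualitative codes and survey scores.
import pandas as pd

qual = pd.DataFrame({
    "participant": ["p01", "p02", "p03"],
    "satisfaction_code": ["not satisfied", "somewhat satisfied", "somewhat satisfied"],
})
quant = pd.DataFrame({
    "participant": ["p01", "p02", "p03"],
    "acceptability_score": [2.5, 4.0, 4.5],  # mean of Likert items (1-5)
})

# Quantify qualitative categories so they can converge with quantitative data.
code_map = {"not satisfied": 0, "somewhat satisfied": 1}
qual["satisfaction_num"] = qual["satisfaction_code"].map(code_map)

merged = qual.merge(quant, on="participant")
print(merged)
print(merged[["satisfaction_num", "acceptability_score"]].corr())
```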
Assessing preliminary impact of Consumer Voice
As part of the pilot, we will also assess implementation outcomes of reach, adoption and fidelity to Safety Planning Intervention and clinical outcomes of patient depression, suicidal ideation and suicidal behaviour. This pilot will not have enough statistical power to detect a conclusive effect of Consumer Voice on implementation or clinical outcomes. The pilot study will allow us to obtain SD estimates of clinical outcomes for sample size determination of future trials.
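As an illustration of how pilot SD estimates could inform a future trial's sample size (all values hypothetical; the actual calculation would depend on the future trial's design), a simple two-arm calculation might look like:

```python
# Illustrative only: hypothetical pilot SD and target difference, not study data.
from statsmodels.stats.power import TTestIndPower

pilot_sd = 6.0                    # SD of depression scores estimated from the pilot
meaningful_difference = 3.0       # between-arm difference worth detecting
effect_size = meaningful_difference / pilot_sd  # Cohen's d

n_per_arm = TTestIndPower().solve_power(
    effect_size=effect_size, alpha=0.05, power=0.80, alternative="two-sided"
)
print(f"Approximate patients needed per arm: {n_per_arm:.0f}")
```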
Measures
To evaluate preliminary implementation and clinical outcomes, we will use the Reach, Effectiveness, Adoption, Implementation, Maintenance (RE-AIM) framework.57 We will collect these data from both clinics during month 13, after the implementation phase. Reach is defined by Safety Planning Intervention being used with the targeted patient population (ie, rural veterans with suicidal ideation or behaviour). Effectiveness is conceptualised as whether veteran depression symptoms and suicidal ideation and behaviour are different because of exposure to Safety Planning Intervention. Adoption is conceptualised as Safety Planning Intervention uptake by providers in primary care and specialty mental healthcare roles at each clinic. Implementation is conceptualised as high fidelity to the Safety Planning Intervention. We will randomly select 30% of rural veterans exposed to Safety Planning Intervention from both clinics to assess implementation fidelity of Safety Planning Intervention. Using this sample, we will conduct chart reviews of the safety plans created in the medical record to assess the quantity of Safety Planning Intervention steps completed; a complete safety plan involves six steps. Table 6 lists planned outcomes and sources from which we will collect data to evaluate these outcomes.
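As a sketch of the fidelity chart-review logic (simulated chart data; patient identifiers and step counts are invented for illustration), the 30% random sample and step counting could be handled as follows:

```python
# Illustrative only: simulated chart-review data, not patient records.
import random

random.seed(2024)  # reproducible sampling for this illustration

# Hypothetical charts: patient id -> documented Safety Planning Intervention steps (0-6).
charts = {f"patient_{i:03d}": random.randint(0, 6) for i in range(1, 101)}

sample_ids = random.sample(list(charts), k=round(0.30 * len(charts)))  # 30% random sample
steps = [charts[pid] for pid in sample_ids]

mean_steps = sum(steps) / len(steps)
share_complete = sum(1 for s in steps if s == 6) / len(steps)  # all six steps documented
print(f"Reviewed {len(sample_ids)} charts: mean steps {mean_steps:.1f}, "
      f"{share_complete:.0%} complete plans")
```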
One possible challenge with the adoption measure is that there may be very low adoption overall, and thus we might need to increase the percentage of chart reviews to identify differences in adoption between sites. One limit to the Implementation measure is that it is a basic fidelity assessment that does not capture quality of completion of Safety Planning Intervention safety plans. It is possible that a fidelity measurement focused on quality will be needed, and if so, we will use rating tools created by the VHA Safety Planning Intervention training group.
Patient sample
To assess clinical effectiveness outcomes, we will analyse a sample of rural VHA patients within both clinics. We will include patients who screen positive on a suicidal ideation question at primary care appointments (ie, Patient Health Questionnaire-2 plus Item 9 (suicidal ideation)). Because data on patient clinical effectiveness will be extracted directly from VHA administrative data, patients sampled at each clinic will represent a convenience sample (vs a random sample); therefore, there is no predetermined sample size.
Analysis
To assess preliminary impact of Consumer Voice on implementation outcomes of reach, adoption and implementation fidelity, we will calculate descriptive statistics. We will not calculate effect sizes from this pilot study because of concerns about inflation of type I and II errors in small sample sizes.55 58 We will describe variance in the outcomes detailed in table 6, including CIs about each point estimate (eg, mean, SD).58 59
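For illustration (hypothetical values, not pilot data), describing an outcome with a point estimate and a 95% CI rather than a hypothesis test might look like:

```python
# Illustrative only: hypothetical monthly reach proportions.
import numpy as np
from scipy import stats

reach = np.array([0.42, 0.55, 0.48, 0.61, 0.50, 0.39])

mean, sd, n = reach.mean(), reach.std(ddof=1), len(reach)
sem = sd / np.sqrt(n)
t_crit = stats.t.ppf(0.975, df=n - 1)  # two-sided 95% confidence interval
ci_low, ci_high = mean - t_crit * sem, mean + t_crit * sem
print(f"Reach: mean={mean:.2f}, SD={sd:.2f}, 95% CI [{ci_low:.2f}, {ci_high:.2f}]")
```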
To evaluate the hypothesis that Consumer Voice will improve patient clinical outcomes, we will conduct inferential statistics. We will conduct an analysis of covariance and compare differences in patient outcomes between the clinic that receives standard Implementation Facilitation and the clinic that receives Implementation Facilitation plus Consumer Voice at month 13, while controlling for baseline level of depression at each site during the 4-month planning period. Independent variables will be implementation assignment and time. The dependent variables will be depression symptoms, suicidal ideation and self-directed violence.
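A hedged sketch of one way this analysis of covariance could be specified (column names and data are hypothetical placeholders; the final model will depend on the structure of the administrative data):

```python
# Illustrative only: invented data; month-13 depression adjusted for baseline.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

df = pd.DataFrame({
    "depression_month13": [8, 12, 6, 15, 9, 11, 7, 14, 5, 10],
    "depression_baseline": [10, 14, 9, 16, 11, 13, 8, 15, 7, 12],
    "condition": ["IF"] * 5 + ["IF_plus_CV"] * 5,  # implementation assignment
})

# ANCOVA: outcome ~ condition, adjusting for baseline depression.
model = smf.ols("depression_month13 ~ C(condition) + depression_baseline", data=df).fit()
print(anova_lm(model, typ=2))
print(model.params)
```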
Aim 3: Evaluate sustainability of Safety Planning Intervention
One metric of the impact of Consumer Voice is how well Safety Planning Intervention is sustained in a clinic that used Implementation Facilitation only compared with a clinic that received Implementation Facilitation plus Consumer Voice.
Design
Therefore, we will use mixed-methods (QUANTITATIVE+qualitative)35 to compare the two clinics on: (1) repeated implementation and clinical outcomes at months 19–22 (observation period) and (2) barriers and facilitators to Safety Planning Intervention sustainment. Quantitative data collection to assess implementation and clinical outcomes will precede qualitative data collection by 1 month to document the fidelity of Safety Planning Intervention at a later timepoint. Qualitative data will be used to assess stakeholder perceptions of Safety Planning Intervention sustainability barriers and facilitators.
Measures and analysis of quantitative data
For maintenance (sustainability), we will measure outcomes guided by the RE-AIM framework for reach, effectiveness, adoption and implementation again in month 19, as reviewed in table 6. For analysis, we will repeat the analyses described in Aim 2.
Sampling and recruitment for qualitative interviews
We will interview again a subset of stakeholders listed in table 1. The purpose of the interviews will be to assess barriers to and facilitators of Safety Planning Intervention sustainment at each clinic to document differences between the contexts of the standard care clinic (Implementation Facilitation only) and the implementation clinic (Implementation Facilitation plus Consumer Voice). Participants will be sampled purposively, by selecting those who were most informative during stakeholder interviews and the nominal group technique in Aim 1 and implementation in Aim 2, and those who presented ‘negative cases’ in Aim 2 (ie, preliminary results that did not fit with the majority of information used to implement Safety Planning Intervention).54 The interview guide will be semi-structured, with questions aligned to the Health Equity Implementation Framework.
Analysis of qualitative data
We will use the blended inductive-deductive analysis40 through a Rapid Assessment Process described in Aim 1.39 Initially, the analysis will be deductive and focused on these specific questions: How has Safety Planning Intervention been sustained? Which implementation strategies contributed to its sustainment? How has consumer involvement affected Safety Planning Intervention sustainment? We anticipate these interviews may elucidate potential mechanisms of change that we would investigate in a subsequent, fully powered trial of Consumer Voice.
We will also compare Sustainability Action Plans completed for each site in the sustainability phase, including any updates to the plans. Specifically, we will code for three criteria within each Sustainability Action Plan: (1) communication between consumers and clinic, (2) consumers involved in developing or reviewing the plan and (3) consumers being sampled for some metric of sustainability (eg, consumer satisfaction, use of Safety Planning Intervention). These criteria were informed by a consumer partnering subscale of a reliable, quantitative sustainability measure.60
Ethics and dissemination
Ethics
A major innovation of this study is the integration of a participatory research approach with implementation science; this novel approach has potential for ethical pitfalls. Participatory research and implementation science come from distinct research traditions. Although there are shared goals, they are also distinctly different on their ethical approach.61
Regarding motivation, both approaches aim to improve society. Participatory research is geared more towards creating social change and building capacity among users; implementation science is geared more towards applying knowledge to help users, although not explicitly to build capacity among them. Regarding social location, both approaches want knowledge users involved in the healthcare system. Participatory research is rooted in grassroots, user-led action (equalising power differentials), whereas implementation science is rooted in decisions made by healthcare professionals; its main aim is not to equalise power between researchers and users, although this may occur. Both approaches propose that users should be engaged in an ethical manner, although what counts as ethical is sometimes defined by consumers in participatory research but by researchers in implementation science.
As we conduct this work, we will have to recruit and retain engagement with multiple stakeholders and pay careful attention to inputs of consumers and processes used, to create an implementation strategy and toolkit that truly exemplifies strengths from both traditions. We will need to pay extra attention to work collaboratively, inclusively, and with respect for people living in rural communities, using suggested best practices by experienced community engaged researchers such as using a variety of participation strategies, allowing extra time for building trust, being a regular presence in the community, and including local customs in interventions or implementation.62
Dissemination
In this pilot, there is a narrow focus on Safety Planning Intervention implementation, and Consumer Voice will require adaptation for other evidence-based practices. Although we are collecting preliminary impact data about implementation and patient outcomes, we will be unable to draw strong conclusions about these research questions.
We plan to use traditional academic modalities of dissemination, including conference presentations and journal publications. We also plan to disseminate findings through meetings with other trainers and teachers in implementation practice so they may adapt or adopt Consumer Voice to meet their needs. Although VHA has no publicly available data repositories, we will make data from our studies available on request.
Ethics statements
Patient consent for publication
Acknowledgments
Thank you to the Veterans Research Council at the VA Center for Mental Healthcare and Outcomes Research, along with Veteran Service Officers in the state of Arkansas, for their collective feedback on elements of this research design. ENW is a fellow with the Implementation Research Institute (IRI), at the George Warren Brown School of Social Work, Washington University in St. Louis; through an award from the National Institute of Mental Health (5R25MH08091607).
References
Footnotes
Contributors ENW conceptualised the study and manuscript, and prepared all written materials, tables and figures. CW helped design methods and integration of mixed methods and edited the manuscript. SJL helped conceptualise the involvement of implementing Safety Planning Intervention and edited the manuscript. LRMH refined the study design and methods and edited the manuscript. KLD helped design analytic plans for qualitative components and edited the manuscript. SO designed analytic plans for quantitative components and edited the manuscript. IAB helped refine methods for Aim 1 (Developing Consumer Voice), edited the manuscript and developed supporting documentation, such as declarations, abbreviations, references and supplemental files. JEK helped develop conceptualisations of the study and manuscript and edited the manuscript.
Funding This work was supported by Career Development Award Number IK2 HX003065 from the US Department of Veterans Affairs Health Services Research and Development (HSRD) Service (ENW).
Disclaimer The views expressed in this article are those of the author and do not necessarily represent the views of the US Department of Veterans Affairs.
Competing interests None declared.
Patient and public involvement Patients and/or the public were involved in the design, or conduct, or reporting, or dissemination plans of this research. Refer to the Methods section for further details.
Provenance and peer review Not commissioned; externally peer reviewed.