Objectives To identify the facilitators and barriers to implementing patient-reported outcome measures (PROMs) in third sector organisations (TSOs) delivering health and well-being services.
Design A qualitative interview study. Participants were recruited using purposive, opportunistic and snowballing methods. Framework analysis was used.
Setting TSOs including charities, community groups and not-for-profit organisations in England, UK.
Participants Thirty interviewees including service users, TSO front-line workers and managers, commissioners of TSOs and other stakeholders such as academic researchers.
Results TSOs primarily used PROMs because of pressures arising from the external funding context. However, organisations often struggled to implement PROMs, rarely getting the process right first time. Facilitators for implementation included having an implementation lead committed to making it work, investing resources in data management systems and support staff and taking a collaborative approach to designing the PROMs process. The latter helped to ensure an appropriate PROMs process for the specific TSO including choosing a suitable measure and planning how data would be collected, processed and used. There was a dilemma about whether TSOs should use standardised well-being measures (eg, the Warwick-Edinburgh Mental Well-being Scale) or design their own PROM. Not all TSOs sustained the collection and reporting of PROMs over time because this required a change in organisational culture to view PROMs as beneficial for the TSO and PROMs becoming part of front-line workers’ job specifications.
Conclusions TSOs are trying to use PROMs because they feel they have no choice but often struggle with implementation. Having an implementation lead, designing an appropriate process, investing resources, training staff and taking mitigating action to address potential barriers can facilitate implementation. Some of the findings are consistent with the experiences of more clinical services so appear relevant to the implementation of PROMs irrespective of the specific context.
- public health
- qualitative research
- health economics
- health policy
This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/.
Strengths and limitations of this study
- First piece of published research specifically focusing on the implementation of patient-reported outcome measures (PROMs) in third sector organisations (TSOs).
- Identified a number of findings useful to commissioners and TSOs to improve the implementation of PROMs.
- Some of the findings may be relevant to healthcare services.
- It would have been useful to interview more people from larger TSOs and from organisations who had stopped using PROMs.
Patient-reported outcome measures (PROMs) are standardised questionnaires which measure patient-reported outcomes such as a person’s health, well-being or symptoms.1–3 If a person answers a questionnaire at two or more time points, for example, before and after receiving support, scores can be compared to understand whether there is any change. Generic PROMs which measure a person’s overall health include the EuroQol-5 Dimension1 and 36-Item Short Form Health Survey.4 Examples of PROMs which focus on well-being include the Warwick-Edinburgh Mental Well-being Scale (WEMWBS),5 the Office for National Statistics Well-being Questions (ONS4)6 and the Personal Well-being Scale.7 PROM scores can be used on an individual service user level to inform their support or the scores of multiple service users can be aggregated to evaluate the impact of a service.1 Policymakers and healthcare services are increasingly attempting to implement PROMs because they can improve communication between clinicians and service users, resulting in improved care and outcomes.8 9 Furthermore, aggregated PROMs are used by commissioners to hold services to account for offering health benefit. For example, the UK’s PROMs programme mandates that hospitals use PROMs for hip and knee replacements.10 In the USA, the Patient-Reported Outcomes Measurement Information System is being implemented.11 Despite the intent to use PROMs, healthcare services can struggle with implementation, resulting in low completion rates.12
Implementation is defined as the process from a service deciding to use PROMs to when they are part of routine practice.13 To improve implementation, researchers have sought to identify potential facilitators and barriers.14–17 To date, this work has been undertaken in clinical services. A recent systematic review of reviews14 identified a number of facilitators including using PROMs to tailor a service user’s care, the importance of choosing an appropriate measure and the need to design a straightforward process for collecting, analysing and using the PROMs data. Furthermore, an implementation lead is needed to progress implementation and to engage and train staff. The review also identified that organisations need to reflect on and develop the PROMs process if problems arise. Importantly, the review identified that many of these issues were bidirectional, becoming barriers if not undertaken by an organisation. For example, staff may not use PROMs if they find the data collection process complex. Other studies have identified similar facilitators and barriers,15–17 with some questioning whether organisations have sufficient resources to invest in the PROMs infrastructure18 and whether the use of measures is sustained.19
Generic implementation theories such as the Knowledge to Action framework20 or the Consolidated Framework for Implementation Research (CFIR)21 may also be useful for identifying issues affecting the use of PROMs. A recent review of PROMs used the CFIR,14 showing how previous PROMs research had not considered the influence of an organisation's characteristics or external influences on implementation, even though these are considered relevant within implementation theories.21
To date, research on implementing PROMs has focused on clinical services and has not considered PROMs usage within third sector organisations (TSOs).14 TSOs, also known as charities, voluntary or community organisations, are increasingly commissioned to deliver health and well-being services (called ‘well-being services’ in this paper) within the UK22–24 through initiatives such as social prescribing, advocacy services and community allotments.25 26 Often TSOs receive short-term funding to deliver their services, with organisations having to demonstrate their impact on the health and well-being of service users to justify further funding.27 PROMs are one approach that TSOs use to demonstrate their impact. However, little is known about how to implement PROMs within TSOs, and a recent review identified the need for such research14 because it is not clear how transferable the known facilitators and barriers to implementing PROMs in clinical services are to TSOs. TSO-delivered well-being services differ from clinical services: they are often run more informally, support comes from peers rather than healthcare professionals, attendance may be long term and service users may access multiple services within a TSO rather than receiving one specific intervention.28–30 Given this gap in knowledge, the study aimed to identify the facilitators and barriers to implementing PROMs in TSOs.
A qualitative interview study of multiple stakeholders was undertaken for an in-depth exploration of different TSOs’ experiences of implementing PROMs in England.31 The Consolidated Criteria for Reporting Qualitative Research checklist was used to guide reporting32 (online supplemental file).
Participants who had different connections to the use of PROMs in TSOs were recruited including service users, front-line workers and managers, commissioners and other relevant stakeholders (eg, academic researchers). Further detail is provided in table 1. Recruitment used a range of sampling strategies including purposive, opportunistic and snowballing approaches.33 Purposive sampling involved targeting people because of their professional roles, such as approaching commissioners who funded TSOs. Opportunistic sampling entailed promoting the study through networks including visiting TSOs. Finally, snowballing was used because some interviewees recommended other people to approach. Thirty-five people were invited; five did not respond and so were not interviewed. Potential interviewees were provided with a Participant Information Sheet and Consent Form when making initial contact and written consent was collected before individuals were interviewed. Recruitment stopped after 30 interviews because the sample was suitably diverse, the information power was high34 and saturation had been reached on some central themes.35
Use of the CFIR
We used the CFIR in this study because it amalgamates a number of implementation theories,13 has been used in a previous review of PROMs14 and provides a framework of 36 constructs which may influence implementation, structured around five different domains. These include21:
Outer setting—factors outside of the organisation, for example, external policies and incentives.
Inner setting—characteristics of an organisation, for example, its culture and structural characteristics.
Characteristics of individuals—how people influence implementation, for example, front-line workers’ knowledge and beliefs about the intervention.
The intervention—the PROMs process, for example, its complexity and adaptability to the specific context.
Process—factors relating to getting PROMs into use, such as planning implementation and reflecting on and evaluating it.
Data collection and analysis
Semistructured interviews were used so that similar questions could be asked of all participants while also providing scope to explore arising issues.36 AF undertook all interviews, predominately conducting them face to face, using telephone interviews when geographical distance was prohibitive. Participants chose the location of the interview; usually this was at the TSO. The topic guides incorporated the CFIR constructs by asking about which measures were used and why, staff engagement, knowledge and beliefs about PROMs, available resources and reflecting on and evaluating implementation. The guides were tailored to each interest group.
The interviews were audio recorded, transcribed verbatim and imported into NVivo V.11.37 Framework analysis was undertaken, entailing the steps of familiarisation, identifying a thematic framework, indexing, charting, and mapping and interpretation.38 Transcripts were read for familiarisation. The thematic framework was developed from findings of a systematic review on implementing PROMs14 and constructs of the CFIR.21 The framework was further developed to account for additional issues identified within the transcripts.39 Data were coded to the framework. During the mapping and interpretation stages of analysis, the themes evolved beyond the CFIR because the findings often transcended several constructs and it was important to use the language of the participants. The analysis was primarily undertaken by AF, with AOC and JH each coding an early transcript for team discussion and providing substantial input into the analysis.
Patient and public involvement
Service users were actively involved in the study including supporting the development of the research, designing the recruitment materials such as Participant Information Sheets, advising on the recruitment strategy and reviewing the topic guides. AF consulted the service users at each stage of analysis to help with interpreting the findings.
Thirty people were interviewed, which included at least five people per interest group (designated by their current role in relation to TSOs) to enable different perspectives to be explored (table 1). Participants were involved in different sized TSOs including neighbourhood-based organisations and national TSOs. Interviewees were primarily located in the North of England (n=24). The majority of interviews were face to face (n=22), with eight by telephone. Interviews were generally an hour long, although the majority of service user interviews were shorter (average length 25 min) because they did not have views about organisational issues.
Overview of factors influencing implementation of PROMs
Multiple factors appeared to influence implementation, some related to the internal and external context of TSOs, while others arose from the process of using PROMs. Figure 1 encapsulates these issues. In table 2, we explain how each theme relates to the CFIR constructs. The majority of the CFIR constructs were identified in the data. The main exception was planning, with interviewees not discussing whether their TSO planned the implementation process.
Each of the factors influencing implementation is presented separately within the findings; however, in practice they interacted and influenced each other, acting as facilitators or barriers depending on how an organisation approached the issue. For example, whether front-line workers used a PROM depended on whether they felt the choice of measure was appropriate in terms of its length, the relevance of the questions and the accessibility of the language.
External context: PROMs are compulsory
A dominant narrative was interviewees believing TSOs have no choice but to engage with PROMs due to funding requirements. Interviewees from all the interest groups discussed how TSOs’ funding came from time-limited contracts and grants. In a national context of austerity, and the trend for outcomes-based commissioning, TSOs were required to measure the benefits of funded services and show value for money to demonstrate accountability. Consequently, TSOs were subject to external policies where commissioners required TSOs to collect PROMs as a condition of funding contracts. This was challenging for organisations because they were funded by multiple commissioners and so had to accommodate each commissioner’s specific requirements for PROMs. Additionally, TSO managers needed PROMs data to support future funding applications. Front-line workers and service users complied with completing PROMs because they understood that funding was needed to enable well-being services to continue. Indeed, some service users felt compelled to complete PROMs in order to access services.
The reality is that you know money is getting tighter and tighter. Whether its grants or contracts […] the only way you’ll attract funding is to be able to show that you make a difference and that you have an impact. (TSO manager 4)
Not all interviewees signed up to a ‘no choice’ narrative. They pointed out that individual commissioners took different approaches to PROMs, that healthcare services were not required to use PROMs to justify funding, and that there was a lack of transparency in how the PROMs data influenced funding decisions.
Organisational commitment: organisational culture and investment can facilitate PROMs
The organisational characteristics of culture and willingness to invest resources into PROMs appeared to affect implementation. Interviewees felt that the culture of TSOs had a bidirectional influence on PROMs. Facilitating aspects included organisations being proactive in adopting new working practices and having good networks among staff, where front-line workers supported each other with using measures.
I think as an organisation we are quite good at being fluid, you know and having a go at things and seeing if they work. (TSO manager 4)
However, some interviewees felt that collecting PROMs detrimentally affected the dynamics of well-being services especially group social activities or when a service user was receiving short-term advocacy support.
TSOs prioritising investment of sufficient resources in implementation was considered to be a pertinent issue by interviewees. This included investing in data management systems and support staff to process PROMs, and training front-line workers. However, TSO managers raised concerns about sustaining investment because they did not consider resourcing PROMs to be part of their core costs. For example, one manager was uncertain whether they could continue to fund a data manager.
Funds are tight for us and it’s one of those roles that I look at and think ‘is it a bit of a luxury?’ On the other hand, I do know that we’ve won funding because of the quality of the data that we’ve been able to provide to people so it’s a real balancing act. (TSO manager 3)
Staffing: strong leadership, buy-in from staff and support from external advisors can facilitate PROMs
The needs, skills and opinions of TSO managers and front-line workers as well as support from external advisors may influence implementation. Interviewees discussed the importance of having an implementation lead, that is, someone who took responsibility for implementing PROMs and offered strategic and operational management of the processes.
Cos when I first came it [the PROM] was just ad-hocly written into funding bids, thinking that they needed it. But nobody was managing it, nobody was managing the workers doing it, nobody was managing those expectations, nobody was really recording it properly and I was just like ahhhhh. How can you cope like this cos it needs to be managed? (TSO manager 7)
Challenges arose if no one within a TSO acted as implementation lead or when the lead did not engage with PROMs. For example, one manager explained how they did not consider PROMs a priority so had not invested time in progressing implementation.
Interviewees felt that front-line workers generally tried to engage with PROMs even if they considered the measures to be inappropriate and invalid. Negative opinions arose from workers feeling their service users’ lives were complex and positive changes may not be captured by an overall assessment of well-being. Additionally, front-line workers believed the language used in measures was too complex for their service users. Despite this, front-line workers discussed engaging with PROMs out of loyalty to their TSO and because they believed collecting PROMs could generate further funding, keeping them in a job. However, some front-line workers struggled to use PROMs as they were concerned that administering measures would damage their relationships with service users because of the seeming irrelevancy of these measures in the context of the serious difficulties people were facing.
But people who are coming to me with the social issues such as they can’t pay their rent or universal credit […] Then it really is irrelevant and some people get quite agitated at being asked to fill in such questions about their mental health, they haven’t actually come to me for a mental health consultation. (TSO front-line worker 1)
External advisors providing support with implementation were valued by some TSO managers because these interviewees did not feel they had the capacity or knowledge themselves. For example, one manager discussed how an external advisor designed the TSO’s data management system.
A collaborative approach improves the appropriateness of the PROMs process
The ‘designing stage’ of implementing PROMs, where a TSO decides which PROMs to use and how to use them, appeared to be critical to the implementation process. Interviewees felt that taking a collaborative approach to ensure the design was appropriate, proportionate and straightforward was important. Collaboration involved commissioners working with, rather than imposing a PROMs process on, an organisation, and TSO managers consulting front-line workers and service users. However, interviewees in our sample often reported that consultation with front-line workers and service users did not occur, explaining that if PROMs had been imposed by commissioners, there was little scope to consult them. Participants felt externally imposed PROM processes were often inappropriate for an organisation’s specific service users, resulting in some TSOs struggling to collect PROMs. However, some organisations overcame the challenge through taking mitigating actions in other parts of the implementation process. For example, one TSO was required to collect a PROM they considered inappropriate but were managing to administer the measure through skilled front-line workers engaging service users. In another TSO, they implemented one PROM throughout the whole organisation and then negotiated with commissioners to be allowed to use this measure. Even if TSOs managed to collect imposed PROMs, interviewees questioned the quality of data generated.
It’s the sort of people that I’m using it on, it’s fundamentally flawed anyway cos some of them I have to, I deal with a lot of people who can’t read or can’t write or got dementia and that makes it irrelevant because they, you say the question and they say ‘ooh what number oh I think it was a three’, but they have no comprehension of what I’ve asked them. (TSO front-line worker 1)
Interviewees explained that TSOs needed to ensure the designed PROMs process was straightforward and proportionate to the specific service user group and organisation. For example, front-line workers discussed how they had to complete multiple PROMs which caused measurement burden and they wanted the process reduced to a single measure.
A dilemma: standardised PROMs or bespoke measures?
Interviewees differed on whether their TSOs used standardised PROMs or had designed their own bespoke measure. Organisations using standardised PROMs generally used well-being measures, with WEMWBS being the most used measure within the sample. Other measures included the Outcome Star and ONS4. Some interviewees believed standardised measures were more credible and using them enabled comparison with other organisations. Other interviewees designed a bespoke measure because they felt that existing PROMs were not appropriate for their context. Bespoke PROMs often drew on established well-being frameworks such as Five Ways to Well-being. Factors influencing the choice of PROM included the preferences of commissioners and implementation leads, experiences of similar TSOs and needing to avoid the licence fees associated with using certain measures.
Sometimes you think ‘ooh it would be good to have a validated tool in terms of being able to compare yourself to that organisation’ and things like that and it’s something we definitely have thought about… but it doesn’t mean they’re right and it doesn’t mean they’re going to work for you. (TSO manager 7)
Developing systems for processing and using the data generated from administering PROMs
TSOs planning how measures would be collected and the data processed, analysed and used appeared to facilitate implementation. PROMs were generally collected by front-line workers supporting service users to complete paper versions within face-to-face appointments. Some interviewees had unsuccessfully tried to use digital methods or asked people to complete PROMs independently before appointments; the service users interviewed were also resistant to these approaches. Interviewees from all the interest groups discussed the difficulty in identifying appropriate time points for collecting PROMs, especially when service users attended the TSO on a long-term or sporadic basis. Having sufficient time and resources within the organisation to process collected PROMs was also highlighted as a challenge. Some TSOs in the sample had invested in staff to perform these tasks and/or in data management systems. Components of data management systems included the function to store PROM scores and systems to report individual service user and amalgamated PROM scores, such as through visual dashboards. Where TSOs had not invested in data management systems, paper-based PROMs could be collected but the data were not processed or used. However, this could also happen if the systems were not fit for purpose.
We’ve set up a management information system and part of that system is to record outcomes and it’s just a new piece of technology, it’s a new way of doing things. It’s really you know looking at it now, and thinking maybe we didn’t get the right one because it’s just so time consuming and staff are just really resistant to it. (TSO manager 4)
A number of managers felt that they had good systems in place to ensure the PROM results were shared with and used by front-line workers and service users. For example, some interviewees spoke about being able to generate dashboards from their data management systems so that front-line workers and service users could view their PROM scores. However, several front-line workers and service users complained about not receiving feedback such as how individual users’ scores had changed. Front-line workers and service users found this frustrating because it meant they could not use the data to inform a service user’s care, making them less likely to engage with PROMs, affecting their sustainability.
When they gave me the second form to fill in I felt happier and said ‘oh now I’ll know if I’ve improved or not’. But when I ask for the result [….], ‘no this was for the records and I can’t access them’. I felt like I’d wasted my time thinking that I will know my score. (Service user 5)
The need for ongoing, practical and ideological training for staff using PROMs
Training front-line workers appeared to be important for facilitating the implementation of PROMs. Interviewees discussed how training should be both practical in terms of learning how to use measures, and also ideological so front-line workers understood the rationale for using PROMs. Managers and front-line workers felt that training needed to be ongoing including refreshers in team meetings and additional training given to individual front-line workers who were not engaging in PROMs.
Me and my manager did one [team meeting] about the importance of monitoring and where it comes from and what it means and the cycle of it and why we do it….just to refresh thinking. (TSO manager 7)
Sustaining the use of PROMs in routine practice: a long-term iterative process
Rarely did TSOs get the PROMs process right first time, resulting in front-line workers struggling to collect measures. Consequently, organisations had to further develop the PROMs process, sometimes by making fundamental changes such as using a different measure. Other organisations only needed to make small refinements, for example, by improving the data management system or staff training.
We thought ‘well we’ll give this [the PROM] a go because it’s been given to us’. But we doubt it’s going to work and fairly quickly by the end of the first quarter we were on our knees with it saying ‘we’ve got to change it’. (TSO front-line worker 2)
Having a trial period was suggested by one front-line worker as a potential way of overcoming these initial problems but none of the interviewees had tried this. It took time for PROMs to become part of routine practice. Interviewees felt that the long-term use of outcome measures was facilitated by front-line workers having PROMs incorporated into their job roles and TSOs undergoing organisational culture change so that they perceived PROMs as beneficial for the organisation such as the data being used to inform a service user’s care or to help generate funding. For example, several TSO managers spoke about setting PROM-related performance objectives for staff.
It’s in the bones, we could all leave and it would still be in the bones. I think it’s sort of, we’ve been on at it long enough now that it’s just, yeah part of our DNA and people know this is just what we do. (TSO manager 6)
In contrast, the length of time it took to implement PROMs was considered a barrier because TSOs rely on short-term funding. A couple of TSO managers in the sample discussed addressing this issue through developing an organisation-wide PROMs process.
Summary of findings
TSOs primarily used PROMs because of pressures arising from the external funding context. However, organisations often struggled to implement PROMs, rarely getting the process right first time. Facilitators for implementation included having an implementation lead committed to making it work, investing resources in processes and taking a collaborative design approach. The latter helped to ensure an appropriate PROMs process for the specific TSO including choosing a suitable measure and planning how data would be collected, processed and used including developing the supporting infrastructure such as data management systems. There was a dilemma about whether TSOs should use standardised measures like the WEMWBS or design their own measure. Not all TSOs sustained the collecting and reporting of PROMs over time because this required a change in organisational culture so that PROMs were viewed as useful to the organisation.
Strengths and limitations
The study’s strengths are that it is the first published research on implementing PROMs in TSOs, the research considered the whole implementation pathway and different interest groups were interviewed. The research would have benefited from having more interviewees from larger TSOs and from organisations that had stopped using PROMs.
Context of other research
Several factors identified were consistent with findings of studies based in healthcare settings whereas other issues appeared unique to TSOs, arising from their specific external and internal context. Key similarities related to designing the process, engaging staff and needing to improve the PROMs process. Implementation in both TSOs and healthcare settings appeared to be facilitated by organisations codesigning an appropriate and straightforward PROMs process, and planning how data would be collected, processed, analysed and used including sharing it with front-line workers and service users.14 19 40 The importance of having skilled and engaged staff who received sufficient training was consistently identified in studies based in different healthcare settings.14 15 19 Organisations experiencing problems when starting to use PROMs and needing to make improvements to facilitate sustainability have also been consistently documented.14 The similarity in findings between TSOs and healthcare settings is understandable because it has been proposed they are sufficiently alike to learn from each other.41
However, some findings appeared to be unique to TSOs or more prominent within them. First, TSOs were motivated to use PROMs to demonstrate their impact because of the sector’s specific funding context, whereas research based in healthcare settings focuses on using PROMs with individual service users to tailor their care.14 Second, TSOs were having to implement PROMs imposed on them by commissioners rather than having the scope to design their own process, which contrasts with good practice guidance on implementing PROMs.42 This research also found that having an implementation lead was fundamental; some but not all previous studies identified the importance of this role, and none placed as much importance on it as TSOs did. Third, TSOs were developing their own measures, unlike in healthcare settings. This was because some interviewees did not feel that existing PROMs developed for other settings were transferable to TSOs, making their use difficult to sustain,43 and measures specifically designed for TSOs are needed. The use of bespoke measures raises questions about the validity of the data being collected as these PROMs have not undergone psychometric testing. Finally, TSOs were generally using paper-based PROMs, which is at odds with the shift towards electronically collected measures.44 45 This variation may reflect concerns about the digital literacy of people accessing TSOs.19 46
The utility of the CFIR
Using the CFIR enhanced our understanding of the range of issues which influence implementation, especially the impact of the external context and an organisation’s characteristics. Without the CFIR, we would not have identified potentially relevant issues that did not arise spontaneously in relation to PROMs. For example, interviewees did not discuss planning implementation, raising questions about whether TSOs take an organic approach to implementation. However, the CFIR had less utility for exploring how the PROMs process was designed and how its use was sustained. A further limitation is that each CFIR construct is treated as independent, whereas we identified implementing PROMs as a process in which the different constructs influenced each other.
When implementing PROMs, commissioners and TSOs need to consider codesigning a PROMs process which is appropriate for a specific organisation and their service users. This includes choosing an appropriate measure alongside deciding suitable ways to collect, process, analyse and use the PROMs data. It appears to be important that TSOs have an implementation lead and invest sufficient resources in processes and infrastructure such as electronic data systems and training. Commissioners could facilitate this by allocating funding for PROMs implementation as part of their funding contracts. Organisations should anticipate problems when initially implementing PROMs and be proactive in addressing these.
Some TSOs managed to implement PROMs despite not having all the facilitators described here, raising questions about whether certain facilitators are more fundamental than others or whether some barriers can be mitigated by facilitators. The relative importance of different facilitators and barriers needs further research. The struggle to find suitable PROMs and to sustain their use could be addressed by developing and validating a measure specifically for TSOs.
To conclude, TSOs are trying to use PROMs because they feel they have no choice but often struggle with implementation. Having an implementation lead, designing an appropriate process, investing resources, training staff and taking mitigating actions to address potential barriers can facilitate implementation.
Contributors AF undertook all the recruitment, interviews and analysis alongside writing the article. AOC and JH coded a transcript, provided ongoing advice on the conduct of the study and had significant input into the analysis. AOC and JH provided substantial feedback on drafts of the article.
Funding The study has been funded through the National Institute for Health Research-Doctoral Research Fellowship (AF; DRF-2016-09-007) scheme and by the National Institute for Health Research Yorkshire and Humber Applied Research Collaboration.
Disclaimer The views expressed in this publication are those of the author(s) and not necessarily those of the NHS, the National Institute for Health Research, Health Education England or the Department of Health. The funder played no role in the undertaking of the review or the writing of the manuscript.
Competing interests None declared.
Patient and public involvement Patients and/or the public were involved in the design, or conduct, or reporting, or dissemination plans of this research. Refer to the Methods section for further details.
Patient consent for publication Not required.
Ethics approval The study was approved by the School of Health and Related Research Ethics Committee (Ref: 013727).
Provenance and peer review Not commissioned; externally peer reviewed.
Data availability statement No data are available. The interview data cannot be shared as this would breach the conditions of the Ethics Committee which granted approval.