Objectives While national quality registries (NQRs) are suggested to provide opportunities for systematic follow-up and learning, and thus clinical improvement, the features of registries and their contexts that trigger such processes are not fully known. This study focuses on one of the world's largest stroke registries, the Swedish NQR Riksstroke, investigating which aspects of the registry and of healthcare organisations facilitate or hinder the use of registry data in clinical quality improvement.
Methods Building on previous qualitative studies, we performed a quantitative survey as the second phase of an exploratory sequential design. The survey, including 50 items on context, processes and the registry, was sent to managers, physicians and nurses engaged in Riksstroke in all 72 Swedish stroke units. Altogether, 242 individuals received the survey; 163 responded, representing all but two units. Data were analysed descriptively and through multiple linear regression.
Results A majority (88%) considered Riksstroke data to facilitate detection of stroke care improvement needs and acknowledged that their data motivated quality improvements (78%). The use of Riksstroke for quality improvement initiatives was associated (R2=0.76) with ‘Colleagues’ call for local results’ (p<0.001), ‘Management Request of Registry Data’ (p<0.001) and ‘Simple to explain the results to colleagues’ (p=0.02). Using stepwise regression, ‘Colleagues’ call for local results’ was identified as the most influential factor. Yet, while 73% reported that managers request registry data, only 39% reported that their colleagues call for the unit's Riksstroke results.
Conclusions While an NQR like Riksstroke demonstrates improvement needs and motivates stakeholders to make progress, local stroke care staff and managers need to engage to keep the momentum going in terms of applying registry data when planning, performing and evaluating quality initiatives.
Strengths and limitations of this study
A survey providing novel insight into what facilitates clinical quality improvements with regard to quality registries.
A study with a good response rate, using a validated survey, covering stakeholders from almost all units in one of the largest stroke registries worldwide.
While national quality registries (NQRs) are well established in countries like Australia, Sweden and the UK, the findings may be applicable to users of other medical registries.
Representing a well-established NQR, findings from Riksstroke may not illustrate barriers in developing registries and/or their use in clinical practice.
Systematic collection and analysis of performance data is a recommended approach for monitoring quality of care and identifying areas for improvement.1 Many countries have thus introduced medical registries to improve healthcare quality.2–4 Sweden has an extensive track record of national quality registries (NQRs).5 Providing individual-based data entries on particular diagnoses, treatment interventions and outcomes, NQRs offer opportunities to monitor and thus improve healthcare quality.6
The NQR on stroke, Riksstroke, represents a renowned diagnosis-based registry. It was established in 1994, and since 1998, all hospitals providing stroke care partake in the registry, including 25 000–26 000 unique care episodes each year.7 Riksstroke comprises the acute care following a stroke and follow-up at 3 and 12 months after discharge for each individual, including medical aspects as well as the multiprofessional stroke care process. It currently contains over 450 000 stroke events, making it one of the world's largest stroke registries.7
While Riksstroke is said to provide opportunities for systematic follow-up and learning,8,9 neither this nor other NQRs have proven to be the expected drivers of local quality improvement. The local focus is often on entering complete data, while local analysis of the data and initiation of improvements based on it are less common.10 Thus, the most recent national subsidisation of NQRs is accompanied by the expectation that NQRs will facilitate continuous quality improvement, cultivating effectiveness and evening out differences in quality of care between health providers.11 However, the complete picture of how and when NQRs contribute to or initiate such processes is still pending. Internationally, factors such as registry coverage, methods for data collection and the definition of variables are still discussed and compared between national stroke registries. Furthermore, a recent review concluded that there is uncertainty about how stroke NQRs feed back quality-of-care data to hospitals or patients, and a lack of detail on how data from such registries are used in quality improvement.12
Previously, using Riksstroke as a case in a series of qualitative studies, we found barriers to and facilitators of quality improvement within the registry itself and in the interplay between the inner and outer contexts of stroke care.13–15 Beyond particular stroke process projects, the use of Riksstroke was ambiguous and highly dependent on devoted professionals in stroke units and among stakeholders at the politicoadministrative level. While these studies provided a profound understanding, albeit of a sample of stroke care in Sweden, a more comprehensive understanding of how an NQR like Riksstroke promotes quality improvement is needed. This study investigates which aspects of Riksstroke and healthcare organisations facilitate or hinder the use of registry data in clinical quality improvement.
This quantitative study is the second phase of an exploratory sequential design.16 Previous qualitative findings identified several factors for further investigation: the organisation's context; the individuals involved in local NQR work; the stroke healthcare process; data registration; data analysis; and experiences of applying the NQR to initiate change.13,14 From these studies and a literature review, we produced a national survey. The survey was in Swedish, but an overview of the content and structure is presented in English (see online supplementary file I). The complete survey can be obtained from the research team.
The preliminary survey was tested for content validity and response process validity in three phases from January through May 2014.17 Initially, the research team examined the content validity in a workshop. Second, six healthcare researchers external to the team examined the survey's structure, content, layout and response options in individual think-aloud interviews.18 Their input prompted minor changes to the wording of questions and response options. Third, the survey was tested in its target population: five NQR users from across Sweden, all in charge of the local work with three similar NQRs in their units. They were scheduled for individual telephone interviews; at the start of each interview, respondents received the survey by email, in accordance with the planned distribution for the main study. They were prompted to respond to the survey and to think aloud on its structure, content and layout. The test resulted in minor changes to wording and a reordering of certain items.
The final survey was designed in a web survey program (LimeSurvey, V.1.90+) and comprised 50 questions organised in 7 sections: (A) Background information about the respondent; (B) Quality of care; (C) Data quality; (D) Organisational conditions; (E) The respondent's use of registry data; (F) The stroke unit's use of registry data and (G) Perceived value of the registry. We mainly used a Likert scale approach for the responses, with five alternatives ranging from ‘Strongly Disagree’ to ‘Strongly Agree’. However, section B partly applied a five-alternative Likert scale ranging from ‘Very Poor’ to ‘Very Strong’, and section E partly applied a four-alternative frequency scale ranging from ‘Never’ to ‘Often’. Each section included an opportunity to provide additional remarks in free text, and the survey program presented each section consecutively.
Sampling and procedure
At each stroke unit, the survey was sent to: (1) the head of the clinic, (2) the physician(s) in charge of Riksstroke (or, if there were none, the physician in charge of the stroke unit) and (3) the registered nurse(s), licensed practical nurse(s) and/or medical secretary (if any) in charge of registering local Riksstroke data. To identify respondents, the national Riksstroke registry administration shared its inventory of all 72 hospital units providing stroke care in Sweden and the name and address of the contact person at each stroke unit. From this information, we identified potential recipients and obtained names and email addresses for at least two and at most five individuals per stroke unit (mean 3.5).
The survey was distributed via email in September 2014. Reminders were sent after 2, 3 and 4 weeks to those who had not yet replied. A final reminder, including an opportunity to give reasons for not partaking, was sent after week 5. Individual consent to participate was given by the voluntary completion and submission of the survey.
Independent and dependent variables
We identified sets of dependent and independent variables (indexes) by processing theoretical knowledge and clinical experience, including our previous qualitative studies,13,14 and a literature review; all indexes are outlined in online supplementary file II. Essentially, an index was created as a dependent variable that depicted the healthcare unit's use of registry data as reported by the respondents (Cronbach's α=0.89). The following indexes, serving as independent variables, were constructed to capture: Support from Outer Setting; Management Request of Registry Data; Management Involvement in Registry-based Quality Improvement; Data Quality and Usefulness; and Resources. In addition, a number of single questions (items 8, 24, 28, 29, 30, 31 and 46) were included as independent variables comprising: the unit's local results; support from the local department and the registry; simplicity of retrieving data from the registry and explaining the results to colleagues and managers; motivation; and colleagues' interest in Riksstroke data.
Validation of indexes
A factor analysis was conducted using SPSS V.23 to validate that our indexes contained relevantly grouped individual items. The factors were extracted using direct oblimin rotation. The Kaiser-Meyer-Olkin Measure of Sampling Adequacy was 0.75, and Bartlett's Test of Sphericity was significant (p<0.001), both indicating that the data were appropriate for factor analysis. The highest correlation between our factors was 0.35, validating the use of the direct oblimin rotation. The scree plot suggested using four factors, but as the exploratory factor analysis was performed to validate our five indexes, we chose to extract five factors. The factor analysis generally validated our scales, as seen in table 1. The extracted factors corresponded closely to those constructed a priori on theoretical grounds. As a final test, we calculated Cronbach's α (using SPSS, V.23) for our indexes, identifying a range from 0.73 for ‘Data Quality and Usefulness’ to 0.91 for ‘Management Request of Registry Data’. Details are found in table 1 and online supplementary file II.
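For readers unfamiliar with the reliability statistic, Cronbach's α for an index is the number of items k scaled by one minus the ratio of summed item variances to the variance of the summed scale. The original analyses were run in SPSS; the following is only an illustrative sketch of the computation, not the study's code:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) matrix of Likert scores.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    """
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # sample variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)
```

Values of 0.73–0.91, as reported for the indexes here, are conventionally read as acceptable to excellent internal consistency.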
A descriptive analysis of individual respondents' demographics and responses was conducted using SPSS V.23, dichotomising the items with a cut-off at Agree. A descriptive analysis of the independent variables used in the regression analysis was also conducted. Using STATA V.13, a multiple linear regression analysis was performed. The chosen unit of analysis was the stroke unit (not the individual respondent), to avoid stroke units with more respondents having a larger impact on the results. Normal distribution of the residuals was verified (skewness–kurtosis test and Shapiro–Wilk test), and the Breusch–Pagan test for heteroscedasticity could not reject constant variance. We used forward selection to determine the order of inclusion in the stepwise regression and then the nestreg command to determine the change in R2.
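The incremental-R2 logic behind Stata's nestreg can be made concrete: predictors are added one at a time, in the order given by forward selection, and the change in R2 at each step is the variance uniquely attributed to that predictor. The sketch below is illustrative only (the study used STATA V.13) and assumes the predictor columns are already ordered:

```python
import numpy as np

def r_squared(X: np.ndarray, y: np.ndarray) -> float:
    """R^2 of an OLS fit of y on X, with an intercept term."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid.var() / y.var()

def nested_r2_changes(X: np.ndarray, y: np.ndarray) -> list:
    """Change in R^2 as predictors (columns of X) are added one at a time,
    mirroring a nested-regression (nestreg-style) decomposition."""
    changes, prev = [], 0.0
    for j in range(1, X.shape[1] + 1):
        r2 = r_squared(X[:, :j], y)
        changes.append(r2 - prev)
        prev = r2
    return changes
```

The first entry of the resulting list corresponds to the most influential factor in the forward-selection ordering, here ‘Colleagues’ call for local results’.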
Response rate and demographics
The survey was sent to 242 individuals, 163 of whom responded (67%), representing 70 of the 72 Swedish hospitals with stroke units (97%). Most respondents were registered nurses, followed by physicians and managers; most completed more than one Riksstroke task, for example, data registration and data analysis (see table 2). A vast majority had been engaged with the local Riksstroke for 3 years or longer, indicating potential experience with full annual cycles of reporting, feedback and analyses. Those who did not respond (but specified why) were mainly managers who reported not working with Riksstroke enough to respond to the survey.
Aggregating the response alternatives ‘Strongly agree’ and ‘Agree’, most respondents felt Riksstroke provided data for identifying areas in need of improvement (88%) and reported using Riksstroke data to do so (76%). Slightly fewer, 63%, reported performing local analyses of their data in Riksstroke, but only 42% reported having enough resources, for instance, time and skills, in the stroke unit to analyse their data. Even with this potential lack of resources, 61% of respondents reported that they retrieve data and 68% that they participate in data analysis. A slight majority (59%) reported that their manager supports quality improvement based on their unit's data, and still more (73%) that their managers request data from the registry. While 63% considered it simple to explain data to fellow staff and managers and 79% presented registry data to others, only 39% reported that their colleagues call for Riksstroke results from their unit. All details are presented in table 3.
Multiple regression results
Using the index of the healthcare unit's use of registry data as the dependent variable, three independent variables were found to be significant: one index, ‘Management Request of Registry Data’ (p<0.001), and two single items, ‘It is simple to explain our department's results to colleagues and managers’ (p<0.001) and ‘Our results are called for by staff members’ (p<0.001). These three variables explained 75% of the total variance (R2=0.75). Neither data quality nor resources were found to be significant for the unit's use of Riksstroke for quality improvement (see table 4). Using stepwise regression, we could see that ‘Our results are called for by staff members’ had the highest impact on explained variance, followed by the index ‘Management Request of Registry Data’ (see table 5).
While quality registries are suggested as a vehicle for improving quality of care, the complete picture of how and when registries inform or drive these processes has not been fully appraised. Riksstroke is often employed in research19 and thus contributes to better care for patients with stroke. However, as with many healthcare innovations, it is not fully known if, how, where and when the NQR is applied in clinical practice20 and what lies behind its effectiveness in improving care, although organisational factors are generally pointed out as important.21 In previous qualitative studies, we found that health professionals and decision-makers described contextual factors at the stroke unit, hospital and regional levels as affecting the use, or lack of use, of Riksstroke to improve stroke care.13,14 This study identified additional features that further illustrate the application of Riksstroke in local quality initiatives. Primarily, the role of managers and coworkers will be considered, along with the limited support this study provides for the notion that resources and data quality shape quality improvement.
Besides research, local quality improvements are needed to advance healthcare. Access to local data is crucial for quality improvement.21 Our findings emphasise that recipients need to understand their local performance in conjunction with healthcare quality to capture improvement needs.22 While an NQR like Riksstroke can provide stroke units with opportunities to access their local longitudinal data on aggregated levels, and to benchmark their care to national standards and/or other stroke units,23 feedback should be managed in groups of peers, with repeated communication on the data to feed improvement initiatives.24 The registry can then function as a platform to improve outcomes by engaging physicians and other clinical staff in the shared task of improving the quality of care.25
Although Sweden and other countries like Australia and the UK have invested in NQRs like Riksstroke,26,27 most efforts focus on securing data quality.13,14 For future progress, quality improvement initiatives must focus on enhancing improvement knowledge and skills, an assignment beyond stroke care expertise.28 A comparison between Sweden and the USA suggests that the Swedish registries are well placed to foster clinical quality improvement, given the accommodating regulations and resources provided at the national level. However, the US system with, for example, automated data capture allows resources to be spent on improvement initiatives rather than data registration.29 Registry expertise and experience shared across countries could stimulate further development in how to use comprehensive process and outcome data in improving, for example, stroke care.30 In Sweden, one of the limitations of registries such as Riksstroke is evidently the burden of registering data.29 This is most likely reflected in the finding that merely 65% of the Riksstroke respondents considered the gain from partaking in the registry to justify the resources spent working with it. Implementing automated data capture could shift resources from securing data to data-led quality improvement work; however, to facilitate clinical improvement, health professionals, managers and policymakers need further support and opportunities to engage in joint ventures.15
A closer look at the results reveals a complex picture: while neither data quality nor resources were significantly correlated with the use of NQR data in local quality improvement, more professionals involved in Riksstroke reported that they themselves use data to improve quality than reported that their stroke unit uses data for this purpose. The limited engagement from colleagues and the obvious influence of data use on local quality improvement suggest the image of a lone stroke expert deciphering local data, while the stroke team members are unaware of the opportunities for quality improvement at their fingertips. Local Riksstroke stakeholders aggregate and present data to peers and managers and find this rather simple. However, this does not seem to increase engagement from peers. Our previous study showed that staff members engaged in Riksstroke at the stroke unit level are aware of the need to identify unique selling points to involve their colleagues.13 However, more collaborative efforts and an understanding of quality improvement are necessary if the data are to help improve stroke care and not just provide feedback. Managers are often considered key to supporting clinical quality improvement,13,14,31 which our findings also support. However, our results show that peer support is just as important, if not more so, in keeping up the momentum to improve stroke care based on an NQR like Riksstroke: this factor had the strongest association with the unit's use of Riksstroke data for quality improvement. The need for team collaboration and support among coworkers is congruent with findings from studies on quality improvement,32 suggesting that successful quality improvement is a joint effort and that support from others is a motivating factor facilitating improvement.33 The interplay between the adoption of innovations34 by individuals and by organisations further emphasises the motivating impact of others being engaged in the same issues as oneself.
Improvements are social processes, and relationships and communication are thus significant for quality improvement. Leaders are important in quality improvement,35 but locally appointed staff working with the registry apparently need staff members to engage to improve stroke care.
Sweden has a universal, comprehensive and tax-based healthcare system similar to those of larger nations like Australia, the UK and Canada. As a result, experiences with NQRs in Sweden may be relevant to registry initiatives in other countries. Riksstroke is a well-known and acclaimed registry, giving this study the potential to pinpoint factors that facilitate quality improvement in stroke care; the findings may also be relevant to similar registries.
The match between the indexes constructed a priori and the factors identified in the factor analysis was relatively good. To facilitate the interpretation of the regression analysis, we chose to keep the theoretically constructed indexes instead of using the factor solution. Given our cross-sectional design, the results cannot distinguish between cause and effect. While we have not tested for causation, it is reasonable to believe that the identified associations are not unidirectional, but rather that there are feedback loops.
Previous studies have shown that besides being a rich source for research, an NQR such as Riksstroke can provide opportunities for local stroke care quality improvement. This study represents 97% of all stroke units across Sweden and a broad scope of managers, physicians and nurses involved in the local assignment with Riksstroke; we found that most participants considered Riksstroke to enable comparisons using relevant and reliable data, and resources spent on Riksstroke to be worthwhile. Yet, data analyses and quality improvements based on the data received less attention than the registration of data. In addition, the use of Riksstroke data for quality improvement initiatives was strongly related to the interest and engagement of fellow stroke care staff and managers. This is a call for further initiatives to engage entire stroke teams in enhancing the potential for applying registry data in planning, performing and evaluating initiatives to improve stroke care.
The authors are grateful to the researchers and clinicians who participated in the validation of the survey tool and to all respondents who completed the survey.
Contributors Validation was performed by SV. TD performed the analyses in dialogue with ACE, LW and MF. ACE, UW, LW and MF obtained funding for the study. ACE drafted the paper and completed it in collaboration with all authors, who approved the final version prior to submission. All authors participated in designing the study and in drafting and testing the survey.
Funding The research leading to these results was supported by the Swedish Association of Local Authorities and Regions (SALAR).
Competing interests None declared.
Ethics approval The regional ethical board, Uppsala, Sweden (2013/181).
Provenance and peer review Not commissioned; externally peer reviewed.
Data sharing statement The complete data set is available at Uppsala University, Sweden.