Objectives To examine the impact of state/territory policy support on (1) uptake of evidence-based continuous quality improvement (CQI) activities and (2) quality of care for Indigenous Australians.
Design Mixed-method comparative case study methodology, drawing on quality-of-care audit data, documentary evidence of policies and strategies and the experience and insights of stakeholders involved in relevant CQI programmes. We use multilevel linear regression to analyse jurisdictional differences in quality of care.
Setting Indigenous primary healthcare services across five states/territories of Australia.
Participants 175 Indigenous primary healthcare services.
Interventions A range of national and state/territory policy and infrastructure initiatives to support CQI, including support for applied research.
Primary and secondary outcome measures (i) Trends in the consistent uptake of evidence-based CQI tools available through a research-based CQI initiative (the Audit and Best Practice in Chronic Disease programme) and (ii) quality of care (as reflected in adherence to best practice guidelines).
Results Progressive uptake of evidence-based CQI activities and steady improvements or maintenance of high-quality care occurred where there was long-term policy and infrastructure support for CQI. Where support was provided but not sustained there was a rapid rise and subsequent fall in relevant CQI activities.
Conclusions Health authorities should ensure consistent and sustained policy and infrastructure support for CQI to enable wide-scale and ongoing improvement in quality of care and, subsequently, health outcomes. It is not sufficient for improvement initiatives to rely on local service managers and clinicians, as their efforts are strongly mediated by higher system-level influences.
- continuous quality improvement
- primary health care
- health policy
- Aboriginal and Torres Strait Islander health
This is an Open Access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/
Strengths and limitations of this study
Using a mixed-method comparative case study methodology and drawing on data from 175 Indigenous primary healthcare services across Australia, we examine the impact of state/territory policy support and strategies on (1) uptake of CQI activities and (2) quality of care for Indigenous Australians.
Our analysis of several years of data from the largest and most comprehensive continuous quality improvement (CQI) programme in Australia shows that consistent and sustained policy and infrastructure support for CQI enables wide-scale and ongoing improvement in quality of care and, subsequently, health outcomes.
Our study adds to the accumulating evidence on the conditions that enable CQI efforts to be most effective.
The authors of this paper have all had longstanding involvement with a national CQI programme as researchers, service providers, managers or policy makers/advisors.
A limitation of our study is that it is not possible to clearly attribute the extent to which trends in data on quality of care have been influenced by various concurrent policy and other initiatives.
Internationally, there is wide variation in adherence to best practice clinical guidelines between health services and between health professionals.1 There is a growing body of evidence about the effectiveness of continuous quality improvement (CQI) in increasing adherence to guidelines and on the factors that contribute to this.2 Variation in quality of care between health services has been demonstrated, including in populations with poorer health status, such as Aboriginal and Torres Strait Islander (hereafter respectfully referred to as Indigenous) peoples in Australia.3 4
Indigenous people’s health and access to primary healthcare
Australia is a high-income country with gross disparities in health outcomes between Indigenous and non-Indigenous people. This inequity has complex causes, including historical trauma and dispossession as a result of colonisation, social and economic conditions and persistent racism. While the Indigenous population is about 730 000 (3% of the Australian total), the numbers and proportion of the population vary widely between jurisdictions.5
Indigenous people access primary healthcare (PHC) through services specifically established to meet their needs—both community controlled and government managed—and private general practice.6
Positive policy environment
A recently proposed four-level framework describing the causes of the ‘evidence–practice gap’7 reinforces previous work calling for change at multiple levels of the health system to support wide-scale improvement in the quality of care.8 While system-wide approaches to CQI have been associated with achieving large-scale improvements in health outcomes, there is limited evidence of the effectiveness of CQI over an extended period.2 A positive policy environment is widely recognised as vital for effective development and implementation of programmes to prevent and manage chronic disease,9 with previous cross-regional analyses identifying the importance of regional-level policies in enhancing clinical performance in Indigenous PHC in Australia.4 However, there is limited evidence as to the effect of government policy on the uptake and impact of CQI over time.
This paper examines the influence of health policy decisions at the Australian state/territory level and how these may have influenced: (i) trends in the consistent uptake of evidence-based CQI tools available through a research-based CQI initiative (the Audit and Best Practice in Chronic Disease (ABCD) programme) and (ii) quality of care (as reflected in adherence to best practice guidelines) in Indigenous PHC services.
National policy context: CQI in Indigenous PHC
The rapid growth since 2002 in CQI initiatives in Indigenous PHC has been supported to varying extents by several large-scale CQI programmes operating across a number of Australian states/territories, for example, the Australian Primary Care Collaboratives (APCC), Healthy for Life and the ABCD programme.10–12 As a programme of applied research, ABCD is the longest running and most extensively documented of these initiatives (box 1). To some extent, the Healthy for Life programme encouraged use of ABCD tools and processes by commissioning and promoting some of the audit tools in the programme. Similarly, engagement with APCC may have been a stimulus for services to explore the use of ABCD tools and processes and vice versa. In 2015, the Australian Government Department of Health provided funding for the development and implementation of a National CQI Framework for Aboriginal and Torres Strait Islander PHC,10 which outlines roles and responsibilities for CQI at various levels of the system.
The ABCD programme and continuous quality improvement tools
The Audit and Best Practice in Chronic Disease (ABCD) programme is a continuous quality improvement (CQI) action research project that employed a systems approach to enhancing care delivered through Indigenous primary healthcare (PHC) services across Australia.3 16 17 Commencing in 2002, ABCD brought together service providers, policy makers and researchers in a collaborative programme of applied research, with the aims of developing CQI tools and processes and enhancing their feasibility on a wide scale; examining factors associated with variation in quality of care and strategies effective in improving it; and working together to enhance the implementation of effective strategies. We have previously reported on factors that influence variation in quality of care between health services11 and are engaged in an ongoing programme of research on priorities and strategies for improvement.13 Supported by a national CQI support entity (One21seventy), since 2010, more than 270 Indigenous PHC services have used standardised evidence-based best practice clinical audit and system assessment tools to assess and reflect on health service system performance, typically on an annual basis. The tools have been used to varying extents in all Australian states/territories. The distribution of PHC services and the increase in engagement over time are depicted in figure 1.
CQI tools developed through the ABCD programme cover priority aspects of PHC (including preventive care, diabetes, child health and maternal health). The clinical audit tools were developed by expert working groups, with participation of specialists in relevant aspects of care and health service staff.3 The tools were designed to enable services to assess their work against best practice standards as reflected in widely accepted evidence-based guidelines; each tool is accompanied by an audit protocol. The ABCD audit tools are ideally used in a system-oriented collaborative and supportive CQI approach, together with an assessment of health service system performance conducted by health service staff in a facilitated group discussion using a standardised systems assessment tool.28 The evidence of effectiveness of the ABCD CQI process11 15 19–24 is consistent with international evidence of effectiveness of quality improvement strategies.2
The ABCD programme
For the duration of its operation, the ABCD programme has had a strong focus on both developing the evidence base for CQI in Indigenous PHC and supporting implementation of evidence-based CQI practices.3 13 14 The ABCD programme, and its associated service support arm One21seventy, have been used most extensively in the Northern Territory (NT) and Queensland (QLD) by both government and community-controlled Indigenous PHC services and to a lesser extent in New South Wales (NSW), South Australia (SA) and Western Australia (WA). The timing and nature of policy and funding support for ABCD and other CQI programmes have varied between jurisdictions. The most substantial support was available in the NT and QLD; support in NSW, SA and WA was generally of smaller scale and more fragmented10 14 (box 1).
We use a comparative case study design to relate state/territory level policy support for CQI to trends in its uptake and in quality of care. The five states/territories provide the ‘cases’ for comparison as they all have some consistent CQI data available through participation by services in the ABCD programme.
Information on the use of CQI processes and tools and on policy and infrastructure support for CQI initiatives is drawn from publicly available sources. Information from these documentary sources is supplemented by the experience and insights of the authors, all of whom have been closely involved (including as service providers, managers, policy makers and advisors, CQI coordinators and researchers) over an extended period in relevant CQI programmes.
Data on CQI activity and on adherence to clinical best practice guidelines were available through ABCD. This paper focuses on four priority aspects of care: preventive, type 2 diabetes, maternal care and child health. The CQI and clinical record audit processes through which data are collected and reported at health service level are summarised in box 1 and online supplementary additional file 1 and described in more detail elsewhere.3 15
For the purpose of assessing extent of CQI activity using ABCD standard tools, we sum the number of different audit tools used in each health service in each year for each jurisdiction.
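The uptake measure described above reduces to a simple counting procedure. The sketch below is purely illustrative — the records, field names and figures are invented, not the actual ABCD/One21seventy data or schema:

```python
from collections import defaultdict

# Hypothetical audit records: (jurisdiction, service, year, tool).
records = [
    ("NT", "svc_A", 2011, "diabetes"),
    ("NT", "svc_A", 2011, "child_health"),
    ("NT", "svc_A", 2011, "diabetes"),   # duplicate tool use, counted once
    ("NT", "svc_B", 2011, "preventive"),
    ("QLD", "svc_C", 2011, "diabetes"),
]

# Distinct audit tools used by each health service in each year.
tools_used = defaultdict(set)
for juris, svc, year, tool in records:
    tools_used[(juris, svc, year)].add(tool)

# Sum the per-service tool counts within each jurisdiction and year.
uptake = defaultdict(int)
for (juris, svc, year), tools in tools_used.items():
    uptake[(juris, year)] += len(tools)

print(dict(uptake))  # {('NT', 2011): 3, ('QLD', 2011): 1}
```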
We use a composite Quality of Care Index (QCI) to measure overall adherence to evidence-based clinical best practice guidelines in the delivery of care for each audit tool over successive years. The QCIs provide a measure of adherence to a package of evidence-based practices within each area of care. They therefore provide a more holistic measure of quality of clinical care (eg, overall delivery of type 2 diabetes care) than specific items of care (eg, monitoring or control of HbA1c). We report on these QCIs for only the NT and QLD, as these jurisdictions had data available from a large number of health services. QCIs were calculated by dividing the total number of services delivered to each client by the total number of possible services in the QCI.15 We use box plots to report QCIs for participating health services by jurisdiction for consecutive years and for consecutive audit cycles for health services that completed audits for at least three cycles (online supplementary additional file 2). Data on additional cycles are reported where there were data from at least half of the health services that completed audits in at least three cycles.
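As a concrete illustration, the per-client QCI calculation is a simple ratio. The helper and numbers below are a minimal sketch invented for illustration, not the actual ABCD audit data:

```python
def qci(services_delivered: int, services_possible: int) -> float:
    """Percentage of applicable best practice services delivered to a client.

    Hypothetical helper illustrating the QCI calculation described above:
    services recorded as delivered, divided by the services applicable to
    that client under the relevant guidelines.
    """
    if services_possible <= 0:
        raise ValueError("client must have at least one applicable service")
    return 100.0 * services_delivered / services_possible

# Example: 9 of 12 applicable type 2 diabetes care services documented.
print(qci(9, 12))  # 75.0
```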
As the data have a hierarchical structure (patients within health services), multilevel linear regressions were run to test the effect of jurisdictional location (NT and QLD) on service delivery (as measured by the QCI). Up to four audit cycles were included in the analysis where there were sufficient numbers of health services to enable cross-jurisdictional comparison. To minimise confounding, we confined analysis to health centres that completed the same number of audit cycles within each jurisdiction. The level of service delivery to individual clients (continuous variable: percentage of QCI delivered) was modelled with health service as an additional-level random effect. Each model included adjustments for year of audit and audit cycle completed. Jurisdictional location (categorical) was included as a fixed effect. Variance partition coefficients were calculated to measure how much variability in adherence to best practice guidelines between health services was attributable to jurisdictional location. Inspection of residual plots showed no obvious deviations from normality or homoscedasticity. p values were obtained by likelihood ratio tests of the model with jurisdictional location against the empty model without this effect. A p value ≥0.05 was considered statistically non-significant. Statistical analyses were conducted with Stata software, V.14.
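Two of the quantities described above follow standard definitions and can be sketched directly. The values below are invented for illustration; the actual analysis was run in Stata V.14 on the ABCD data:

```python
def vpc(var_between: float, var_within: float) -> float:
    """Variance partition coefficient: the share of total variance
    attributable to the higher level of a two-level linear model
    (here, between health services rather than between clients)."""
    return var_between / (var_between + var_within)

def lr_statistic(loglik_full: float, loglik_empty: float) -> float:
    """Likelihood ratio statistic comparing the model including
    jurisdictional location against the empty model; referred to a
    chi-squared distribution to obtain a p value."""
    return 2.0 * (loglik_full - loglik_empty)

# Illustrative numbers only.
print(vpc(2.0, 8.0))                   # 0.2 -> 20% of variability at the higher level
print(lr_statistic(-1200.0, -1210.0))  # 20.0
```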
Ethical approval for the ABCD National Research Partnership was obtained from research ethics committees in each relevant Australian jurisdiction.3
Policy initiatives that may have influenced uptake of the ABCD CQI programme, by state and territory
A number of national CQI initiatives may have influenced uptake of ABCD along with those being implemented simultaneously by the states/territories.10 12 An overview of CQI policy initiatives in the jurisdictions showing the greatest uptake of the ABCD CQI tools is presented in summary form in table 1 and in more detail in online supplementary additional file 3.
A total of 286 Indigenous PHC services used ABCD standard tools and reported data through the One21seventy web-based information system between 2005 and 2014. Of these health services, 175 voluntarily provided de-identified clinical audit data for analysis and reporting.
The most substantial early uptake of the CQI tools was in the NT (table 1; figure 2; online supplementary additional file 3) where they were implemented in 12 health services following the first evidence of their success.3 There was a decline in the use of the tools in the NT in 2010, the final year of the extension phase of the ABCD research project, followed by a large increase in use the following year. This increase coincided both with the establishment of One21seventy as a service support agency for using ABCD CQI tools and processes and with the commencement of the NT CQI Strategy and corresponding funding support. The use of ABCD CQI tools plateaued over the period 2012–2014. An external evaluation commissioned by the NT government supported sustainability and embedding of processes.16
In QLD, use of the ABCD CQI tools commenced in 2007/8, with the engagement of QLD Health and some community-controlled PHC services (largely in the north of the state) in the ABCD programme (table 1; figure 2; online supplementary additional file 3). This followed an internal review of evidence on improving healthcare delivery and subsequent recommendations to increase investment in CQI in 2008 and again in 2010. There was a rapid increase in the use of the tools to a peak in 2011 and 2012, following the second investment by QLD Health in CQI coordinators and facilitators and in supporting health services to access ABCD tools and the One21seventy web-based information system. There was a marked decline in the use of the ABCD CQI tools in 2013 and 2014, following the change in government in 2012, a lack of policy support and cuts in funding.
New South Wales
Use of the ABCD CQI tools in NSW peaked in 2008 and 2009 but declined as the state’s early leading exponent of CQI, Maari Ma Health Aboriginal Corporation in Broken Hill, shifted attention to using the ABCD audit tools in selected aspects of clinical care and applying CQI techniques to the management of various organisational systems and processes (table 1; figure 2; online supplementary additional file 3). There was some continuing use of ABCD CQI tools in Maari Ma Health and in other NSW services despite the absence of direct support for the use of these tools from NSW health authorities.
In WA, use of the ABCD CQI tools increased from 2005 to a peak in 2008 and 2009 across several health services (table 1; figure 2; online supplementary additional file 3). The decline in usage coincided with the end of ABCD’s extension phase, but a number of health services continued to use the tools despite relatively limited engagement with ongoing research and no direct support from WA health authorities.
A small number of services used the ABCD CQI tools in SA between 2006 and 2010 and slightly more between 2011 and 2014—the increase coinciding with the provision of limited funding and policy support from research and SA Health (table 1; figure 2; online supplementary additional file 3). This policy support occurred after an internal review (similar to QLD) of the evidence and best options for improving delivery of care.
Trends in quality of care
The QCIs of adherence to best practice guidelines for health services in the NT generally show improvement over audit cycles and over successive years. More specifically, between audit cycles 1 and 4, the median % of services delivered for participating health centres increased by more than 25% for overall preventive care and by about 10% for overall type 2 diabetes care and overall child healthcare (online supplementary additional file 2; table 2). There was also improvement in the median % of services delivered in successive years for all four areas of care. The improvement in NT is accompanied by a reduction in variation between health services for preventive care and child health QCIs, due to improvement among poorer-performing health services.
In QLD, the QCIs of adherence to best practice guidelines show a mixed picture. There was improvement in the median % of services delivered for participating health services between audit cycles 1 and 4 of about 15% for overall antenatal care. For overall type 2 diabetes care and overall preventive care, there was an increase in the median % of services delivered of about 10% and 5%, respectively, between audit cycles 1 and 3, followed by a decline at audit cycle 4 (online supplementary additional file 2; table 2). There was no clear trend for diabetes care over successive years or over audit cycles or for preventive care over time. There was a declining trend over successive years and no clear increasing or decreasing trend over audit cycles for child health. Nor was there a clear reduction in variation between health services in any of the four areas of care over time or over audit cycles.
The multilevel linear regression analyses showed a significant difference between the two jurisdictions for preventive and diabetes care. After adjusting for year of audit and number of cycles completed, the predicted increase in adherence to best practice for NT compared with QLD health services was 12% (95% CI 5.61 to 17.70; p<0.0001) for preventive care and 16% (95% CI 11.87 to 19.58; p<0.0001) for diabetes care. Jurisdictional location accounted for 17% and 18.2% of the explained variability in adherence to best practice guidelines for preventive and diabetes care, respectively. There was no significant difference between jurisdictions in relation to child or maternal care (table 3).
Progressive and sustained uptake of ABCD tools occurred in the NT in the context of consistent long-term policy and infrastructure support for CQI. This contrasted with (a) a rapid rise and subsequent fall in uptake of these tools in QLD where the initial high-level policy and infrastructure support was not sustained following a change of government in 2012 and (b) low levels of uptake in jurisdictions with relatively less policy and infrastructure support (NSW, WA, SA). The consistent long-term policy and infrastructure support for CQI in the NT was also associated with steady improvements or maintenance of high-quality care (as reflected in clinical best practice guidelines) for the four aspects of care that were the major focus of ABCD CQI efforts and reduction in variation between health services for two of these. This contrasted with the situation in QLD where there was a relatively limited effect on adherence to best practice guidelines and on variation between health services.
While this study does not provide an in-depth examination of the complex processes that might explain different trends in the uptake of tools or how CQI processes have impacted on quality of care in different jurisdictions, some insight has been provided by previous studies of the ABCD CQI programme11 15–24 and the evaluation of the NT CQI Strategy.18 Gardner et al highlighted the complexity of the process of uptake of CQI and the critical role of alignment of policies and incentives, a systems approach, organisation-wide commitment, leadership at all levels and resources to support implementation.19 Our findings of relatively low uptake of CQI in jurisdictions with limited policy and infrastructure support, and of the rapid drop in use of CQI tools when policy, infrastructure and funding support was withdrawn in QLD, highlight the critical role these play in supporting uptake. In these states, the lack of clear and consistent policy direction, resourcing and sustained high-level leadership and management support for CQI, together with a relative lack of engagement in wide-scale CQI research, has led to a diversity of locally driven initiatives and an associated lack of systematic analysis and reporting of data for CQI purposes. This appears to have been a barrier to demonstrably effective uptake of CQI in many Indigenous PHC services between 2005 and 2014.
The limited availability of relevant data for systematic analysis and reporting, other than in QLD and the NT, has precluded meaningful analysis of adherence to best practice guidelines for most states/territories. The first report on national Key Performance Indicators (nKPIs) from Indigenous PHC organisations showed that, in 2012–2013, those in QLD and the NT performed better against almost all process-of-care indicators,25 attributing this to the relatively well-established CQI programme in these jurisdictions. The third and most recent nKPI report, which includes data up to December 2014,26 shows improvements for 17 of the 19 process-of-care measures for all jurisdictions combined, with continued relatively high performance in the NT and QLD and the most marked recent improvement in WA. The analysis presented in this paper points to the importance of high-level policy support and resourcing for implementation of systematic CQI processes to enhance quality of care. The relatively high performance and the greater ability to report nKPI data in the NT and QLD demonstrate the benefits of systematic CQI processes for reporting of data on KPIs as well as for enhancing quality of care.
The independent evaluation of the NT CQI Strategy provides important insights into the relative success of CQI initiatives in the NT. There has been no comparable publicly available independent evaluation in QLD, NSW, WA or SA, and it may be that an external evaluation such as that of the Strategy plays a role in ensuring sustainability and momentum. The formalised collaborative engagement of the community-controlled and government sectors in the NT through the Aboriginal Health Forum and the shared commitment and enthusiasm for a territory-wide CQI Strategy have also contributed to the achievements in the NT. Given the importance of working effectively together to respond to the complex care needs of Indigenous patients, it appears that a partnership approach adopted across service sectors is a critical component underpinning efforts in improving quality of care.
Another important component has been the adaptation of collaborative methods to sustain the engagement of experienced front-line service providers and managers, such as bringing them together to share learnings. Together with sustained investment, the shared commitment and enthusiastic engagement in CQI in NT is likely to have engendered the sense of collective efficacy and collective valuing of CQI data that has led to the effectiveness of CQI.11
An important limitation of our study is that it is not possible to determine clearly the extent to which trends in data on quality of care have been influenced by policy support for the ABCD CQI programme or by other initiatives (eg, funding, workforce or infrastructure developments). The difficulty of demonstrating causality is common to much policy research27; we therefore argue here for contribution rather than attribution. Improvements in the quality of care in the NT built on substantial earlier initiatives, including electronic patient information record systems, the development and implementation of a Chronic Disease Strategy and sustained commitment to workforce development.
The ABCD data are not representative of all Indigenous PHC services. There was variable participation in different jurisdictions and by government-operated and community-controlled health services. For example, in the NT, there were substantial numbers of both service types participating in ABCD, but relatively low numbers of community-controlled services in QLD. The ABCD data need to be interpreted in relation to a range of other CQI activities in Indigenous PHC services over the period for which data have been reported.10 12 While there were some substantial initiatives, particularly in NT and QLD, most CQI initiatives were small scale, narrow in scope and without the capability to analyse and report consistent data to the extent possible through ABCD. Nor has it been possible to assess systematically these CQI activities or their impact on quality of care. In addition, there were a range of non-CQI initiatives at the national (eg, Indigenous Chronic Disease Package)28 and local levels, which may have impacted on quality of care over the period for which we have reported data. More generally, as with all research of this type, it is vital to consider historical, socioeconomic and health service and system contexts in assessing the generalisability or transferability of the findings to other PHC settings in Australia or internationally.
The authors of this paper have all had longstanding involvement with the ABCD programme as researchers, service providers, managers or policy makers/advisors. While our interest in ABCD may have influenced our interpretation of the data, the diversity of roles, insights and perspectives that we bring allows for critical reflection in the interpretation of the data and brings rigour to this type of research.27
The ABCD experience, as reflected in this paper, has important implications for practice, policy and further research, including the implementation of the National CQI Framework for Aboriginal and Torres Strait Islander PHC.10 For clinical staff and management of health services, the benefits of participating in this type of collaborative programme include access to a CQI system that provides data on recent performance and trend data across the broad scope of primary care and the ability to benchmark against other services at the regional, state/territory and national level. For policy professionals, benefits include the ability to monitor adherence to best practice guidelines at all levels and to target improvements to specific aspects or modes of care,24 population groups (eg, children or the elderly) or geographic locations. An important challenge for ongoing and new CQI initiatives is to enhance local ownership and engagement, while ensuring the use of standard tools and supporting the analytical capability that enables the use of consistent good-quality data for CQI purposes at multiple levels of the system. Sustaining efforts to deliver the best care according to changing evidence over time remains important and warrants further attention.
Our study adds to the accumulating evidence on the conditions that enable CQI efforts to be most effective. The findings show the potential contribution that systematic and sustained policy and infrastructure support can make to wide-scale uptake and to the effectiveness of CQI methods in improving the quality of care. It is now about 10 years since our first published paper on the potential for CQI to enhance the quality of healthcare for Indigenous Australians. With the development of a National CQI Framework in 2015,10 it appears we may be at the dawn of a new era of wide-scale and systematic use of CQI methods. While local efforts are vital to the effective use of CQI methods, state/territory-level policy and resources will be critical to building capability and a supportive environment.
The development of this manuscript would not have been possible without the active support, enthusiasm and commitment of staff in participating primary health care services and members of the ABCD National Research Partnership and the Centre for Research Excellence in Integrated Quality Improvement.
Contributors RB conceived and had the primary role in drafting of the manuscript. VM undertook the quantitative data analysis and had a major role in drafting and review. All other authors (SL, ST, CP, TW, JB, FC, RK, LC) played substantial roles in providing information on QI initiatives in various states and territories, in analysis and interpretation of data and review of successive drafts of the manuscript. All authors read and approved the final manuscript.
Funding The National Health and Medical Research Council has funded the Audit and Best Practice for Chronic Disease National Research Partnership Project (#545267) and the Centre for Research Excellence in Integrated Quality Improvement (#1078927). In-kind and financial support has been provided by the Lowitja Institute and a range of community-controlled and government agencies.
Competing interests RB was the scientific director of One21seventy, a not-for-profit entity within Menzies School of Health Research that provided CQI support on a fee-for-service basis to primary healthcare services across Australia. RB is also the lead investigator on the Audit and Best Practice for Chronic Disease research programme, and other authors are co-investigators. None of the authors received financial support from One21seventy, and One21seventy did not provide any financial support for the preparation of this manuscript. The authors have no other competing interests in the preparation of this manuscript.
Ethics approval Ethics approval was obtained from research ethics committees in each jurisdiction (Human Research Ethics Committee of the Northern Territory Department of Health and Menzies School of Health Research (HREC-EC00153); Central Australian Human Research Ethics Committee (HREC-12-53); New South Wales Greater Western Area Health Service Human Research Committee (HREC/11/GWAHS/23); Queensland Human Research Ethics Committee Darling Downs Health Services District (HREC/11/QTDD/47); South Australian Aboriginal Health Research Ethics Committee (04-10-319); Curtin University Human Research Ethics Committee (HR140/2008); Western Australian Country Health Services Research Ethics Committee (2011/27); Western Australia Aboriginal Health Information and Ethics Committee (111-8/05); University of Western Australia Human Research Ethics Committee (RA/4/1/5051)).
Provenance and peer review Not commissioned; externally peer reviewed.
Data sharing statement The Audit and Best Practice for Chronic Disease dataset analysed during the current study is not publicly available due to health centre confidentiality but is available from the corresponding author on reasonable request and if consistent with the project’s ethics approvals.