Strengths and limitations of this study
This is the first UK study to examine the effects of striking junior doctors, as well as the first to evaluate the impact of withheld in-hospital emergency services (the April 2016 strike was the first UK strike ever to include emergency care).
This was a large analysis of English hospital administrative data from the 2016 strikes, which showed a significant impact on outpatient appointments, admitted patient care and accident and emergency visits.
This work did not include any financial modelling of the impact of industrial action on either the national or the regional level.
The study was unable to examine the health impacts on patients who could not attend hospitals due to industrial action. Additionally, qualitative outcomes such as disappointment and inconvenience were not collected.
In each of the first 4 months of 2016, junior doctors from all specialties in England engaged in industrial action with a series of 24–48 hour strikes, culminating in a 2-day strike that included the withdrawal of emergency services.1 The purpose of the action was to protest against contractual changes for all junior doctors introduced by the Department of Health (DH) regarding safe working hours and pay.2 Doctors’ strikes in the UK are very rare: before the 2016 strikes, there had been only one, much smaller, strike in the previous 40 years (in 2012).3 4
Breaks from routine care patterns offer an important window into the effectiveness of currently established treatment services. During annual meetings of the American Heart Association and the American College of Cardiology, there are significant drops in 30-day mortality among high-risk patients admitted with heart attacks or cardiac failure.5 Therefore, the 2016 strikes provided an ideal opportunity to evaluate the effectiveness of current systems and to locate weaknesses in national responses to staffing shortages.
Metcalfe et al 6 studied strikes among doctors in the USA, Israel, Spain, Croatia, South Africa, India and the UK. Almost all of the strikes they examined showed little to no effect on patient mortality. In fact, only one (a 20-day strike of all doctors in a single South African province in 2010) reported increased mortality rates: patients who presented in emergency departments were 67% more likely to die than during a normal period.6 Ruiz et al 3 analysed the 24-hour strike on 21 June 2012 in England, in which approximately 8% of doctors in England took part.7 For their analysis, they used Hospital Episode Statistics (HES), the national hospital administrative database for England’s National Health Service (NHS), and compared the week of the strike with the weeks immediately preceding and following it. Their analysis found an increase in outpatient appointment cancellations, but no significant differences in mortality between strike and non-strike periods.
This current work aimed to examine the impact of the junior doctors’ strikes in early 2016 using HES, which contains data on NHS activity. This data set allowed us to investigate trends in the number of admissions (inpatients), outpatient appointment cancellations, accident and emergency (A&E) attendances, and in-hospital deaths during strike periods and to compare these with the expected numbers based on an average non-strike period.
HES includes details of all admissions to NHS hospitals in England and is collected by the DH. HES data covering all recorded episodes of admitted patient care, outpatient appointments and A&E attendances were extracted for the week of each strike. Strike action by English junior doctors took place on four occasions throughout early 2016: 12 January, 10 February, 9–10 March and 26–27 April. For comparison with normal operations, we also extracted all data from the weeks immediately preceding and following each strike. For simplicity, weekends were excluded from our analysis.
Because the bank holiday in the week of 2 May affected normal hospital operations and attendance, the second comparator week for the April strike was replaced with the week of 9–13 May.
Each hospital admission is recorded as a ‘spell’ consisting of a number of ‘consultant episodes’, each of which denotes a period of care under a different consultant during the admission.8 If an admission includes transfer to another hospital before discharge, the whole period of care is recorded as a ‘superspell’. For our analysis of admitted patient care, only the first ‘episode’ in a superspell of care was used to identify the date of initial admission, so as to avoid multiple counting.
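The deduplication step above can be sketched as follows. This is an illustrative outline only, assuming simplified records; the field names (`superspell_id`, `epistart`) are hypothetical stand-ins, not actual HES column names.

```python
# Sketch: keep only the first episode of each superspell so that each
# admission is counted once, dated by its initial episode.
# Record fields are hypothetical, not actual HES column names.

def first_episodes(episodes):
    """Return one record per superspell: the earliest-dated episode."""
    first = {}
    for ep in episodes:
        key = ep["superspell_id"]
        if key not in first or ep["epistart"] < first[key]["epistart"]:
            first[key] = ep
    return list(first.values())

episodes = [
    {"superspell_id": "S1", "epistart": "2016-04-26"},
    {"superspell_id": "S1", "epistart": "2016-04-27"},  # transfer episode
    {"superspell_id": "S2", "epistart": "2016-04-26"},
]
admissions = first_episodes(episodes)  # one record each for S1 and S2
```

ISO-format date strings compare correctly as plain strings, so no date parsing is needed for this step.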
The data from the comparator weeks were averaged into what was assumed to be a ‘normal’ week. This allowed for comparison with the strike data to provide an indication of the impacts of individual strikes.
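The baseline construction described above can be illustrated with a short sketch, assuming hypothetical daily counts; the function and variable names are ours, not part of the study's code.

```python
# Sketch: average the two comparator weeks into a 'normal' week, then
# express strike-day counts as a percentage change against that baseline.
# All counts below are hypothetical.

def expected_counts(week_before, week_after):
    """Average two comparator weeks into a 'normal' week, day by day."""
    return [(b + a) / 2 for b, a in zip(week_before, week_after)]

def percentage_change(strike_week, expected):
    """Percentage change of strike-week counts against the baseline."""
    return [100 * (s - e) / e for s, e in zip(strike_week, expected)]

# Illustrative daily admission counts, Monday to Friday (weekends excluded)
before = [900, 950, 940, 930, 910]
after  = [920, 930, 960, 950, 890]
strike = [880, 700, 930, 940, 900]  # strike on the Tuesday

baseline = expected_counts(before, after)
change = percentage_change(strike, baseline)
```

In this toy example the Tuesday baseline is (950 + 930) / 2 = 940, so a strike-day count of 700 registers as roughly a 25% drop.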
Daily totals for A&E attendances, outpatient appointments and hospital admissions were calculated. Admitted patients were separated into elective and emergency categories using the ‘admimeth’ method of admission field in HES. Day surgery cases were extracted using the ‘CLASSPAT’ field.
Outpatient appointments were analysed using the ‘attended’ field, which includes the following categories:
0=not applicable—appointment occurs in the future.
2=appointment cancelled by, or on behalf of, the patient.
3=did not attend—no advance warning given.
4=appointment cancelled or postponed by the healthcare provider.
5=seen, having attended on time or, if late, before the relevant care professional was ready to see the patient.
6=arrived late, after the relevant care professional was ready to see the patient, but was seen.
7=did not attend—patient arrived late and could not be seen.
This analysis, as with Ruiz et al’s analysis of the June 2012 strike,3 focused primarily on category 4 for cancellations, and on categories 5 and 6 to denote actual attendance of appointments.
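The grouping of the ‘attended’ codes used above can be sketched as follows. The code values are those listed in the text; the function name and the input list are illustrative assumptions.

```python
# Sketch: tally provider cancellations (code 4) and attended
# appointments (codes 5 and 6) from HES outpatient 'attended' values.

PROVIDER_CANCELLED = {4}
ATTENDED = {5, 6}

def summarise(attended_codes):
    """Count provider cancellations and attendances in a list of codes."""
    cancelled = sum(1 for c in attended_codes if c in PROVIDER_CANCELLED)
    attended = sum(1 for c in attended_codes if c in ATTENDED)
    return cancelled, attended

# Hypothetical sample of appointment records
codes = [5, 4, 6, 3, 4, 5, 2, 7]
cancelled, attended = summarise(codes)  # 2 cancellations, 3 attendances
```

Codes 2, 3 and 7 (patient cancellations and non-attendances) fall into neither tally, mirroring the focus of the analysis on provider behaviour and completed appointments.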
To obtain death counts, the discharge method field (‘dismeth’) was used to capture deaths in hospital. The A&E attendance disposal field (‘aeattenddisp’) was used to determine which patients died within the A&E department. HES outpatient data do not have the capability to record deaths during appointments, so these data were not used for this outcome.
Finally, regional analyses were performed on all outpatient, A&E and admitted patient data using provider code data (‘procode’).
For our analysis, it was assumed that patient counts were described by a Poisson distribution, as all values are discrete non-negative integers. A χ2 test was used to evaluate significance for proportions. All P values <0.05 were considered statistically significant.
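A minimal sketch of the significance testing described above is given below: a Pearson χ2 test on a 2×2 table comparing a proportion (here, hypothetically, deaths among admissions) between a strike period and a comparator period. The counts are invented for illustration, and this stdlib-only implementation is our own sketch, not the study's code.

```python
# Sketch: 2x2 Pearson chi-squared test (1 degree of freedom, no
# continuity correction) for comparing proportions between periods.
import math

def chi2_2x2(a, b, c, d):
    """Chi-squared statistic and p value for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    stat = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    # Survival function of chi-squared with 1 df: P(X > x) = erfc(sqrt(x/2))
    p = math.erfc(math.sqrt(stat / 2))
    return stat, p

# Hypothetical counts: deaths vs survivors, strike week vs comparator week
stat, p = chi2_2x2(30, 9970, 28, 9972)
significant = p < 0.05  # False here: proportions are nearly identical
```

With 1 degree of freedom the χ2 survival function reduces to a complementary error function, which keeps the sketch free of external dependencies such as scipy.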
No patients were involved in setting the research question or the outcome measures, nor were they involved in the design and implementation of the study. There are no plans to involve patients in the dissemination of results.
In total, this study involved the extraction and analysis of 3.4 million admissions, 27 million outpatient appointments and 3.4 million A&E attendances over 12 weeks.
Table 1 shows the impacts of the industrial action of early 2016 on admitted patient care, outpatient appointments and A&E.
Figure 1 shows the percentage change in A&E visits, admitted patients and outpatient appointments across all four strikes against the average of the chosen comparator weeks.
The largest impacts on normal operations were seen in the April strike. This is to be expected, as it lasted 48 hours and was the only instance in which emergency care was also withheld.
During the April strike, there was a decrease in total admissions of 18 194 patients compared with the expected volume for that period, which comprised a 7.8% decrease in emergency admissions and a 19.9% decrease in elective admissions. Day cases showed a reduction of 9846 patients (18.7%) compared with the comparator weeks.
Furthermore, 109 915 (11.1%) fewer outpatient appointments were scheduled during the strike period than usual, and 134 711 (17.1%) fewer outpatient appointments were attended during strike days. This was paired with an increase in provider cancellations of outpatient appointments of 43 823 (+66.8%). Additionally, fewer patients attended A&E during this period, with 17 325 (−14.7%) fewer attendees than expected.
The first strike (Tuesday, 12 January 2016) also showed a large and significant (9.6%) decrease in A&E attendance, despite A&E services operating normally during this time. This may be due to the intense media attention this strike received, given its historic significance. Additionally, some providers warned patients to avoid hospitals ‘unless absolutely necessary’.9 This effect diminished during the February and March strikes.
Figure 2 shows the impacts of strikes on the numbers of appointments cancelled or postponed by healthcare providers. The largest impacts were seen during the January and April strikes, which showed increases of 54.5% and 66.8% compared with expected cancellations. Our analysis found that 101 109 outpatient appointments were cancelled due to strike action in 2016.
Table 2 shows the differential regional impacts of the strikes on both outpatient appointments and cancellations. This analysis shows particularly large increases in outpatient cancellations by healthcare providers in London, the South East Coast and Yorkshire and the Humber. These areas also showed large decreases in the number of overall outpatient appointments attended. The East Midlands also showed a large decrease in overall appointment attendance. The South Central region showed a much smaller increase in cancellations than the others.
The table also shows the regional variation in admitted patient care during strike periods, separated into elective admissions (including day cases) and non-elective emergency admissions. As with outpatient appointments, the regions most affected for elective admissions were Yorkshire and the Humber, London and the East Midlands, all of which showed sizeable drops in recorded elective admissions. For emergency (ie, non-elective) admissions, recorded impacts were smaller and affected different areas, such as the South West and West Midlands.
Our analysis found an average reduction in A&E patient volume of 7.09% across all strike days. The largest regional drops in volume were found in Yorkshire and the Humber (8.05%) and the North East (7.8%). Impacts were more limited in the East of England (5.93%) and the North West (6.24%).
We found that industrial action by junior doctors in 2016 resulted in a total of 31 651 fewer admissions, 173 462 fewer outpatient appointments and 23 895 fewer A&E attendances compared with expected volumes from similar weeks. Large effects were seen in the numbers of outpatient appointments cancelled by healthcare providers, who cancelled a total of 294 844 appointments, a 52% increase compared with the expected volume during these periods. The most pronounced effects on NHS operations were seen during the first (12 January) and last (26–27 April) strikes. During all strike days, a total of 3209 patients died in hospital during emergency admissions, 98 died during elective admissions and 356 died in A&E. However, the numbers of recorded hospital deaths did not change significantly during the strikes compared with the expected numbers for either admitted patients or A&E, in line with what has been seen in most other studies of striking doctors globally.6 We found no measurable effect on mortality within the dates analysed, although deaths due to poor care are likely to occur after an associated delay.

Regional analysis showed that the strikes disproportionately affected London, Yorkshire and the Humber and the East Midlands for outpatient appointments and elective admissions. Emergency admissions were most affected in the South West and West Midlands regions. A&E attendance was most affected in the North East, Yorkshire and the Humber and South Central England.

The 12 January strike corresponded with a 9% drop in A&E attendances, despite industrial action not affecting emergency services. This is noteworthy because it implies that many patients may have consciously avoided going to hospital during this period, perhaps due to intense media coverage of the event and explicit instructions from some providers to avoid all non-urgent hospital attendances.
Our analysis is broadly consistent with similar studies of this type. Prior work by Ruiz et al 3 has shown the effects of striking doctors on outpatient cancellations by provider. This work replicates that effect: during the strike on 26–27 April 2016, there was an increase in cancellations of 67% when compared with average figures from the surrounding weeks. As with Ruiz et al 3 and almost every previous study of this type both nationally and internationally,6 this work did not find a significant effect on mortality among either admitted or A&E patients during strike days. This could be because there is no effect, or because our study lacked the power to demonstrate one, given the short period covered by the strike days. It may be the case that during periods of industrial action, staffing priority is given to critical care, resulting in small differences in mortality but a poorer patient experience in non-vital care. This has previously been discussed by Metcalfe et al 6 in their international comparison of the impacts of industrial action by doctors.
This is a large national study that was able to analyse the majority of admissions, outpatient appointments and A&E visits during the 2016 strikes. HES has previously been shown to have reasonable accuracy at both outpatient10 and inpatient11 levels. Furthermore, the strike on 26–27 April gave researchers in the UK their first opportunity to investigate the effects of withholding emergency care, as no previous strike had withheld it. During this period, there was a drop of 17 325 (almost 15%) in the number of patients attending A&E compared with the expected volume for this time period. We found no evidence of increased mortality during the study period.
The weaknesses of this work are predominantly due to what was not investigated. For example, HES data alone do not allow investigation of the effects of strikes on patients who did not attend A&E during this period. Furthermore, the analysis focused only on the weeks in which strikes occurred, which prevented the capture of lagged effects in the immediate aftermath of a strike. The design of the study meant that no outcomes were measured at weekends; this is a limitation both because of the missing outcomes themselves and because of the ongoing debate over the existence of a ‘Weekend Effect’ in mortality. Outcomes (especially mortality) proved difficult to measure: death counts during strike days were small, and hence lacked statistical power, and many patients stayed in hospital for more than a single day. Importantly, other outcomes such as morbidity, direct financial costs and opportunity costs for both the NHS (through rescheduling elective operations and other procedures) and patients (taking time off work, childcare costs and so on) were not captured. This study also includes no qualitative element, which prevents us from capturing unrecorded outcomes of strikes, such as disappointment, inconvenience, stress and worry. Finally, the occurrence of a national bank holiday in the week of 2 May meant that patient profiles were likely to differ between strike and comparator weeks; as such, the second comparator week for the April strike was replaced with data from the week of 9–13 May.
The four junior doctors’ strikes between January and April 2016 had significant negative impacts on patient care as measured by hospital activity. Significant increases in outpatient appointment cancellations by hospitals were paired with decreases in admitted patients and A&E visits. The major outcome we investigated was mortality, which showed no measurable change; however, this is likely to be the least sensitive outcome for quality and safety concerns. These findings may also suggest that NHS Trusts responded effectively to the industrial action by cancelling outpatient appointments to protect higher-risk services. Future work in this area should focus on how the strikes affected waiting times and similar quality outcomes. Strike-related morbidity (such as disease progression in the time between rescheduled operations or appointments) is likely to be a fertile avenue for investigation. Delays are also likely to carry an associated cost burden in terms of worse patient outcomes and hence costlier treatment, which should be accounted for. Finally, it should be determined whether quality of care was negatively affected in the period immediately following the strikes.
Contributors Study design: PA, DF. Data collection: DF, PA, AB. Data analysis and interpretation: DF, AB, PA. Drafting the article: DF. Critical revision of the article: PA, AB. Final approval of version to be published: PA, AB, DF.
Funding NIHR Programme Grants for Applied Research: RDPSC 79560. The Dr Foster Unit is an academic unit in the Department of Primary Care and Public Health, within the School of Public Health, Imperial College London. The unit receives research funding from the National Institute of Health Research and Dr Foster Intelligence, an independent health service research organisation (a wholly owned subsidiary of Telstra). The Dr Foster Unit at Imperial is affiliated with the National Institute of Health Research (NIHR) Imperial Patient Safety Translational Research Centre. The NIHR Imperial Patient Safety Translational Centre is a partnership between the Imperial College Healthcare NHS Trust and Imperial College London. The Department of Primary Care and Public Health at Imperial College London is grateful for support from the NW London NIHR Collaboration for Leadership in Applied Health Research and Care (CLAHRC) and the Imperial NIHR Biomedical Research Centre.
Disclaimer The views expressed in this article are those of the author(s) and not necessarily those of the NHS, the NIHR or the Department of Health.
Competing interests PA is principal investigator for the Dr Foster Unit, an academic unit in the Department of Primary Care and Public Health, within the School of Public Health, Imperial College London. The unit receives research funding from Dr Foster Intelligence, an independent health service research organisation (a wholly owned subsidiary of Telstra).
Ethics approval The principal investigator has approval from the Secretary of State and the Health Research Authority under Regulation 5 of the Health Service (Control of Patient Information) Regulations 2002 to hold confidential data and analyse them for research purposes (CAG ref 15/CAG/0005). We have approval to use them for research and measuring quality of delivery of healthcare from the London - South East Ethics Committee (REC ref 15/LO/0824).
Provenance and peer review Not commissioned; externally peer reviewed.
Data sharing statement All SAS code used in the study is available upon request from the corresponding author.