
Adaptation of the DEMQOL-Proxy for routine use in care homes: a cross-sectional study of the reliability and validity of DEMQOL-CH
  1. Laura J Hughes1,
  2. Nicolas Farina1,
  3. Thomas E Page2,
  4. Naji Tabet1,
  5. Sube Banerjee1
  1. 1Neuroscience, Brighton and Sussex Medical School, Brighton, UK
  2. 2Psychology, University of Kent, Canterbury, UK
  1. Correspondence to Professor Sube Banerjee; s.banerjee{at}bsms.ac.uk

Abstract

Objective To investigate the routine use of a measure of quality of life (QoL) in care homes and assess its psychometric properties when used by care staff.

Design A cross-sectional two-phase study.

Setting and participants Data were collected from care staff in seven care homes in East Sussex, England.

Method Phase I: The ability of care staff from two care homes to use the DEMQOL-Proxy without interviewer administration was assessed using agreement analysis between a self-administered and interviewer-administered version of the instrument. Based on these findings, DEMQOL-Proxy was adapted into a new version, DEMQOL-CH, for use as a self-administered instrument in care homes. We assessed agreement between the new DEMQOL-CH and DEMQOL-Proxy to ensure DEMQOL-CH was used correctly. Phase II: A preliminary assessment of the psychometric properties of DEMQOL-CH when used routinely was completed in a further five care homes.

Results Phase I: Nineteen care staff from two care homes completed QoL measurements for residents. Systematic error was identified when staff self-completed the DEMQOL-Proxy without an interviewer. We modified the DEMQOL-Proxy to create DEMQOL-CH; this reduced the error, producing a version that could be used more accurately by care staff. Phase II: Eleven care staff from five care homes rated resident QoL routinely. DEMQOL-CH showed acceptable psychometric properties with satisfactory reliability and validity and a clear factor structure.

Conclusions The research presents positive preliminary data on the acceptability, feasibility and performance of routine QoL measurement in care homes using an adapted version of DEMQOL-Proxy, the DEMQOL-CH. Results provide evidence to support the concept that routine measurement of QoL may be possible in care homes. Research is needed to refine and test the methodology and instrument further and to explore the potential for benefits to residents, staff and care homes in larger and more representative populations.

  • dementia
  • education and training (see medical education and training)
  • geriatric medicine

This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/.


Strengths and limitations of this study

  • This study offers new insight into the routine measurement of quality of life in care homes.

  • The findings from agreement analysis provide useful information about the use of self-administered instruments in a care home population.

  • The psychometric properties of a self-administered quality of life instrument for use in care homes are reported.

  • Repeated measures data were collected from only a modest sample size of participants.

  • Quality of life, stress and burden in care staff were not examined.

Background

Approximately 416 000 people live in care homes in the UK, including a third of those with dementia.1 The quality and variability of care home care have been questioned, with calls to increase and improve the services provided to these vulnerable adults.2–4 Recognising that the outcomes of care are as important as the process of care,5 there has been a growing interest in measuring quality of life (QoL) as a means of understanding and improving care in care settings.6 The broad assessment afforded by QoL measurement is particularly relevant in this setting considering the wide impacts of dementia and other physical disorders, and the interplay between biological, psychological and social factors in the frail elderly. An increasing amount of research in care settings is therefore being carried out with QoL as a key outcome measure in evaluating interventions in care homes.7 Given that the large majority of care home residents have dementia, these studies have used dementia-specific QoL measures such as the Quality of Life in Alzheimer’s Disease (QoL-AD)8 and Dementia Quality of Life (DEMQOL).9 However, most available measures were developed and evaluated in community-dwelling or mixed community and care home populations, with variable validity and reliability in care home settings.10

It has been suggested that the routine use of a QoL instrument might be beneficial in enabling staff to understand and monitor the QoL of care home residents and how it changes over time.11 Care staff could use QoL measurements to inform the content of care plans and to monitor their effect on residents under their care. Relatives of residents could be given data in addition to that on the physical health of their relatives, and care inspectors and regulators could use QoL data to make inspection findings more relevant and potentially to have greater positive impact.12 The routine measurement of QoL as a part of normal care practice could allow the monitoring of QoL at an individual resident level and at the home level by aggregating resident data. This could provide powerful data with which to understand and improve quality of care. To date, no studies have implemented or assessed routine QoL measurement in care homes. One of the reasons for this could be the lack of appropriate instruments that are usable in care homes by care staff.13

To enable routine QoL measurement in care homes, an appropriate instrument is required. Instrument questions need to work in care settings; some instruments not developed specifically for use in care home settings contain inappropriate questions, often reflecting the opportunity to perform a function rather than the ability to perform it.14 The type of instrument and its administration is also important. Due to the high prevalence of dementia in care settings6 15 16 and consequent difficulties in self-report, the use of proxy report may be needed to permit the inclusion of residents with all severities of dementia and provide consistent measurement of QoL over time. Also, instruments often require interviewer administration by a trained researcher and can be difficult to access. Few instruments have been developed for self-administration by care home staff as would be needed if such measurement were part of routine care practice.

We therefore carried out a study that aimed to assess the use of DEMQOL-Proxy17 as a self-administered QoL instrument for routine use by care home staff as a part of normal care practice. DEMQOL-Proxy is a widely used instrument for measuring the QoL of people with dementia; it has good psychometric properties and is the product of thorough development.10 17 The validity and reliability of DEMQOL-Proxy have been shown to be acceptable in people with mild, moderate and severe dementia.17 DEMQOL-Proxy is freely available to use and has acceptable usability, which are important factors for routine use of QoL instruments in care homes.13 Importantly, DEMQOL-Proxy has appropriately framed questions for a care home setting. Some instruments not developed specifically for use in care home settings can contain inappropriate questions, often reflecting the opportunity to perform a function rather than the ability to perform it.14 DEMQOL-Proxy asks how people feel, regardless of whether they can still perform those functions. The study had two phases. In the first, we assessed agreement between self-administration and interviewer administration of DEMQOL-Proxy, adapted the instrument to generate a care home version (DEMQOL-CH) for use by care staff, and assessed agreement between DEMQOL-CH and DEMQOL-Proxy. In the second phase, we completed a preliminary assessment of the psychometric properties of DEMQOL-CH used routinely by care home staff as a part of normal practice.

Method

Setting and sample

We recruited care staff and residents from seven care homes in East Sussex, England. Two homes were involved in the first phase (assessing agreement between self-administration and interviewer administration and generating DEMQOL-CH), and five participated in the second phase (assessing the psychometric properties of DEMQOL-CH). All care homes provided residential care to older adults and were registered with Alzheimer’s disease or dementia as a specialist care category. Four of the five homes in phase II were nursing homes; all others provided care without nursing. All permanent staff (full time or part time) were eligible to take part. All residents were eligible to take part; we did not exclude residents without suspected dementia. Presence of cognitive impairment was ascertained using screening instruments (see below). We did not access medical records to determine whether residents had a formal diagnosis of dementia. Inclusion criteria for the homes were that they provided care for older adults and were not under special measures from the Care Quality Commission, the independent regulator for health and social care services in England. Resident capacity was assessed; informed consent was obtained where possible, and those without capacity were appointed a personal or nominated consultee.

Phase I

Instruments

DEMQOL-Proxy17: DEMQOL-Proxy is a 31-item interviewer-administered proxy report instrument which measures the QoL of people with dementia. It has a two-factor structure of ‘functioning’ and ‘emotion’ organised over three sections that ask about feelings, memory and everyday life. Items are scored on a Likert scale from one to four (a lot, quite a bit, a little and not at all) with a score range of 31–124. Higher overall scores indicate better QoL.
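To make the scoring concrete, the following is a minimal sketch (not the official DEMQOL scoring guide) of how a total in the 31–124 range can be computed, assuming each item has already been coded 1–4 and that any reverse-coding required by the instrument’s own guide has been applied, so that higher values always reflect better QoL.

```python
# Minimal scoring sketch (not the official DEMQOL scoring guide). Assumes the
# 31 item responses are already coded 1-4 and any reverse-coding required by
# the instrument's guide has been applied, so higher always means better QoL.

def demqol_ch_total(item_scores):
    """Sum 31 item scores (each 1-4) into a total in the range 31-124."""
    if len(item_scores) != 31:
        raise ValueError("DEMQOL-CH has 31 items")
    if any(score not in (1, 2, 3, 4) for score in item_scores):
        raise ValueError("each item must be scored 1-4")
    return sum(item_scores)

print(demqol_ch_total([3] * 31))  # a resident rated 3 on every item scores 93
```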

Procedure

Agreement between self-administered and interviewer-administered DEMQOL-Proxy—Care staff measured resident QoL using a self-administered DEMQOL-Proxy first, followed by the interviewer-administered version, with an interval of approximately 6 hours between measurements. This interval was long enough that staff would not simply repeat their previous responses but short enough to ensure both ratings covered a similar timeframe. The same residents were rated at each time point.

Generation of DEMQOL-CH—Findings from the agreement analysis (see below) suggested that staff were misinterpreting some of the questions of the instrument. This was supported by discussions with care staff when completing the DEMQOL-Proxy as an interviewer-administered instrument. For example, when a member of care staff responded ‘a lot’ to the question ‘how worried would you say the resident has been about his/her memory in general?’, the researcher asked a follow-up question such as ‘how much does that worry them?’; very often the staff member would reply that it did not worry the resident, but that the resident had memory problems, and would then change their response to ‘not at all’. One potential reason for this is the layout and structure of the questions in DEMQOL-Proxy. Questions are structured with a timeframe, stem and item; the timeframe and stem are written in a sentence preceding the items and responses for each section (figure 1). Two of the sections in DEMQOL-Proxy ask the respondent ‘how worried’ the person with dementia has been. It appeared that care staff might have neglected to read this first part of the stem for each question in the self-administered instrument, resulting in misinterpretation of the question and leading to poor agreement between the self-administered and interviewer-administered instruments. If care staff misread the stem and item, there is a possibility that they recorded their response based on the resident’s functional ability rather than their QoL, as they would not have read the segment that asked ‘how worried’ the resident had been. To mitigate this, we restructured the layout of the questions by placing the stem (eg, how worried has the resident been about…) before each question item (figure 1) to encourage and prompt care staff to read it for every question and respond based on QoL.

Figure 1

Change to the questionnaire structure of DEMQOL-Proxy to create DEMQOL-CH, with emphasis on the QoL aspect of the questions. Extracts from the memory section of (A) DEMQOL-Proxy and (B) DEMQOL-CH.

Agreement between DEMQOL-CH and DEMQOL-Proxy—Care staff measured resident QoL using the DEMQOL-Proxy and DEMQOL-CH instruments. To counterbalance primacy effects, care staff were randomly allocated to one of two conditions: (1) complete DEMQOL-CH first then DEMQOL-Proxy or (2) complete DEMQOL-Proxy first then DEMQOL-CH. An interval of 6 hours between ratings was set. DEMQOL-Proxy was used in its standard format as an interviewer-administered instrument and DEMQOL-CH was self-completed by care home staff.

All care staff were trained in the use of DEMQOL-Proxy and DEMQOL-CH by the researcher. This took approximately 5 min to complete and was guided by the available user guide for interviewer administration. In addition, for the DEMQOL-CH, a new user guide appropriate for staff self-administration was created and supplied with each copy of the DEMQOL-CH instrument.

Statistical analysis

Agreement analysis—We assessed agreement between DEMQOL-Proxy (self-administered) and DEMQOL-Proxy (interviewer-administered), and between DEMQOL-Proxy and DEMQOL-CH. Paired t-tests were used to compare mean scores between the instruments, and Bland–Altman plot analysis was used to assess agreement.18 A threshold of five points, estimated as half an SD of the total scale, was chosen as the clinically accepted measurement error for changes in QoL scores.19
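As an illustration of this analysis plan (not the authors’ code), the sketch below computes a paired t-test and the Bland–Altman mean difference with 95% limits of agreement for hypothetical paired QoL totals; the data values are invented.

```python
# Illustrative sketch of the Phase I agreement analysis: paired t-test plus
# Bland-Altman mean difference and 95% limits of agreement. Data are invented.
import numpy as np
from scipy import stats

def bland_altman(a, b):
    """Return mean difference and 95% limits of agreement for paired scores."""
    diff = np.asarray(a, float) - np.asarray(b, float)
    md, sd = diff.mean(), diff.std(ddof=1)
    return md, md - 1.96 * sd, md + 1.96 * sd

self_administered = [96, 88, 102, 110, 79, 95, 101, 84]
interviewer_admin = [100, 95, 104, 108, 90, 99, 103, 92]

t, p = stats.ttest_rel(self_administered, interviewer_admin)
md, lower, upper = bland_altman(self_administered, interviewer_admin)
print(f"paired t = {t:.2f}, p = {p:.3f}")
print(f"mean difference = {md:.1f}, limits of agreement = ({lower:.1f}, {upper:.1f})")
# The mean difference would be judged against the pre-set 5-point threshold;
# proportional bias can be checked by regressing the differences on the pair
# means (e.g., scipy.stats.linregress).
```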

Phase II

Instruments

Clinical Dementia Rating Scale (CDR)20: The CDR is a 5-point scale used to characterise the severity of impairment in dementia, assessing six domains of functional performance: memory, orientation, judgement and problem solving, community affairs, home and hobbies, and personal care. The CDR is a reliable and valid tool for rating dementia severity; it is widely used in care home research and, having been translated into approximately 60 languages, is one of the most widely used severity measures.21 22 It is scored 0, 0.5, 1, 2 and 3, equivalent to no, questionable (minimal), mild, moderate and severe impairment.

Standardised Mini-Mental State Examination (sMMSE)23: The sMMSE is a 20-item questionnaire which assesses specific domains of functioning in older adults, such as orientation to time and place, short-term and long-term memory, registration, recall, constructional ability, language, and the ability to understand and follow commands. Scores range from 0 (severe dementia) to 30 (normal cognition). The sMMSE is widely used in care homes and the community and is designed to measure the severity of cognitive impairment.

Dementia Care Mapping (DCM) (8th ed)24: DCM is an observational tool designed for use in communal or ‘public’ areas within care settings. A trained person, known as a mapper, observes up to five people for up to 6 hours at a time. The mapper documents activities every 5 min and assigns a behaviour category code (BCC) from a list of 23 possible codes, choosing the code that best reflects what the individual is doing. BCCs fit into two categories: type 1, positive behaviours such as expressive activities that have high potential for well-being, and type 2, negative behaviours such as being socially withdrawn that have low potential for well-being. To each BCC, the mapper assigns a well/ill-being (WIB) score (−5, −3, −1, +1, +3, +5), where −5 indicates the greatest ill-being and +5 the greatest well-being. DCM has been used extensively in research studies in care homes; there is mixed support for the validity and reliability of the DCM tool25; however, it has been used successfully in a number of studies as a measure of concurrent validity with QoL instruments.26
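The sketch below illustrates, with invented observations and hypothetical code labels, how the three DCM summary indices used later for convergent validity (mean WIB score, %WIB +3 or +5 and %BCC type 1) can be derived from a mapping session.

```python
# Hypothetical 5-minute time frames from one mapping session:
# (behaviour category code, WIB score). Codes and values are invented.
observations = [("A", 1), ("F", 3), ("B", -1), ("A", 1), ("K", 5), ("C", -3)]

# Hypothetical assignment of codes to DCM type 1 (positive) behaviours
type1_codes = {"A", "F", "K"}

wib = [score for _, score in observations]
mean_wib = sum(wib) / len(wib)
pct_wib_high = 100 * sum(score in (3, 5) for score in wib) / len(wib)
pct_bcc_type1 = 100 * sum(code in type1_codes for code, _ in observations) / len(observations)

print(f"mean WIB = {mean_wib:.2f}")            # 1.00
print(f"%WIB +3 or +5 = {pct_wib_high:.0f}%")  # 33%
print(f"%BCC type 1 = {pct_bcc_type1:.0f}%")   # 67%
```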

Procedure

Assessment of the psychometric properties of DEMQOL-CH—Care staff rated resident QoL routinely using DEMQOL-CH. Staff rated approximately five residents at each time point. Staff rated the same residents more than once, and each resident was rated by more than one member of staff over the course of the study. Resident cognitive function was assessed by the researcher using the sMMSE and CDR. As validation, the researcher carried out a DCM session in each care home over a continuous 3-hour period. Each mapping session incorporated the hour before lunch as this is representative of the rest of the day.27

All staff were trained in the use of DEMQOL-CH by the researcher and provided with a copy of the DEMQOL-CH user guide.

Psychometric analysis

Acceptability and data quality—Acceptability and data quality were assessed by calculating missing data and floor and ceiling effects. In line with the original development of DEMQOL-Proxy,17 the criteria set were <5% for missing data and <10% for floor and ceiling effects.
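A small sketch of these checks is shown below; it assumes the ratings are held in a pandas DataFrame with one row per completed DEMQOL-CH and one column per item (the data layout is an assumption, not taken from the paper).

```python
# Acceptability/data-quality checks: % missing item data and floor/ceiling
# effects on the total score, judged against the <5% and <10% criteria.
import pandas as pd

def acceptability(items: pd.DataFrame, min_total: int = 31, max_total: int = 124):
    pct_missing = 100 * items.isna().mean().mean()
    totals = items.sum(axis=1, skipna=False).dropna()  # totals for complete ratings
    pct_floor = 100 * (totals == min_total).mean()
    pct_ceiling = 100 * (totals == max_total).mean()
    return pct_missing, pct_floor, pct_ceiling

# Usage (ratings_df: rows = completed ratings, columns = the 31 items):
# missing, floor, ceiling = acceptability(ratings_df)
```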

Reliability—We assessed internal consistency of DEMQOL-CH using Cronbach’s alpha (α), with a criterion of ≥0.70 considered acceptable. Test–retest reliability was assessed with the intraclass correlation coefficient (ICC), using a two-way mixed-effects model of absolute agreement. We assessed inter-rater reliability using the ICC with a one-way random-effects model; ratings made up to 7 days apart were included in this analysis. For test–retest and inter-rater reliability, a criterion of ≥0.75 was considered good.28
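By way of illustration (not the authors’ code), the sketch below computes Cronbach’s alpha from its standard formula and obtains ICC estimates from the pingouin package, which reports the common ICC variants in a single table; the long-format data are invented and the use of pingouin is an assumption about tooling.

```python
# Reliability sketch: Cronbach's alpha from the usual variance formula, and
# ICCs via pingouin's intraclass_corr (which tabulates the standard ICC forms;
# the variant matching each design described above is then selected).
import pandas as pd
import pingouin as pg

def cronbach_alpha(items: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the total)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

# Invented long-format data: one row per (resident, rater) total score
long_df = pd.DataFrame({
    "resident": [1, 1, 2, 2, 3, 3, 4, 4],
    "rater":    ["a", "b", "a", "b", "a", "b", "a", "b"],
    "score":    [98, 95, 110, 108, 87, 92, 101, 99],
})
icc_table = pg.intraclass_corr(data=long_df, targets="resident",
                               raters="rater", ratings="score")
print(icc_table[["Type", "ICC", "CI95%"]])
```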

Validity—DEMQOL-Proxy was developed in mixed community and care home dwelling populations and had a two-factor structure. More recent studies in community-dwelling populations have found five-factor29 and four-factor structures.30 Moreover, DEMQOL-Proxy has never been assessed solely in a care home population, nor has it been completed by care staff. Changes such as these can alter the psychometric properties of instruments and require their re-evaluation.31 32 We therefore performed new exploratory factor analyses to evaluate the dimensionality of DEMQOL-CH when used in this population and with a new administration method. Suitability of the data for factor analysis was checked using the Kaiser–Meyer–Olkin (KMO) measure (>0.5) and Bartlett’s test of sphericity (p<0.05).33 All factor analyses used unweighted least squares extraction, as it is more appropriate for smaller sample sizes and provides better estimates,34 with direct oblimin rotation. To assess the fit of the factor model, the differences between the observed and model-based correlations were analysed; no more than 50% of the residuals should be >0.05.35 Eigenvalues and the scree plot were used to determine the number of factors to extract. Item loadings of ≥0.40 were considered acceptable.36 Convergent validity is the extent to which an instrument correlates with measures of the same or similar constructs. We assessed the convergent validity of DEMQOL-CH by examining correlations with the DCM indices of mean WIB score, %WIB +3 or +5 and %BCC type 1. Based on previous research,5 we hypothesised moderate positive correlations (approximately 0.30–0.50) between DEMQOL-CH and DCM, given that DEMQOL-CH is a proxy instrument and DCM an observational tool.
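A sketch of these factor-analysis steps is given below using the factor_analyzer package; the choice of package is an assumption (the paper does not name its software), and its ‘minres’ extraction is used here as a close stand-in for unweighted least squares.

```python
# Exploratory factor analysis sketch: KMO and Bartlett suitability checks,
# then extraction with direct oblimin rotation. 'minres' stands in for
# unweighted least squares; the package choice is an assumption.
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import (calculate_bartlett_sphericity,
                                              calculate_kmo)

def run_efa(items: pd.DataFrame, n_factors: int = 4) -> pd.DataFrame:
    chi_square, p_value = calculate_bartlett_sphericity(items)
    _, kmo_total = calculate_kmo(items)
    print(f"KMO = {kmo_total:.2f}, Bartlett chi2 = {chi_square:.1f}, p = {p_value:.4f}")

    fa = FactorAnalyzer(n_factors=n_factors, rotation="oblimin", method="minres")
    fa.fit(items)
    loadings = pd.DataFrame(fa.loadings_, index=items.columns)
    # Only loadings of at least 0.40 are retained when interpreting factors
    return loadings.where(loadings.abs() >= 0.40)

# Usage: loadings = run_efa(demqol_ch_items_df, n_factors=4)
```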

Factors associated with QoL scores—The factors associated with QoL vary throughout the literature, with studies producing conflicting findings regarding the extent of the relationship between dementia severity and QoL. Here, we attempt to add to the literature on these factors using a new method of collecting QoL data. We carried out hierarchical regression analysis to assess which factors accounted for the most variance in predicting QoL. Hierarchical multiple regression assesses the relationship between outcome and predictor variables and the effect that changes to the predictor variables have on the outcome variable. The assumptions of hierarchical regression were tested prior to the analysis. A three-step hierarchical multiple regression analysis was conducted with total QoL as the dependent variable. Resident characteristics (gender and dementia severity) were entered at step 1; variables related to time spent with residents and time working in care (time working in the care home, time working in the care sector, hours worked per day) were entered at step 2; and staff confidence in completing each DEMQOL-CH was entered at step 3.
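The sketch below shows one way to run such a three-step (nested) regression and report the R² change at each step, using statsmodels; the variable names and the choice of library are assumptions for illustration only.

```python
# Three-step hierarchical regression sketch with R-squared change per step.
# Variable names are hypothetical; statsmodels is an assumed tool choice.
import pandas as pd
import statsmodels.formula.api as smf

STEPS = [
    "qol ~ resident_gender + dementia_severity",                        # step 1
    "qol ~ resident_gender + dementia_severity + months_in_home"
    " + years_in_sector + hours_per_day",                               # step 2
    "qol ~ resident_gender + dementia_severity + months_in_home"
    " + years_in_sector + hours_per_day + rating_confidence",           # step 3
]

def hierarchical_r2(data: pd.DataFrame) -> None:
    previous_r2 = 0.0
    for step, formula in enumerate(STEPS, start=1):
        fit = smf.ols(formula, data=data).fit()
        print(f"step {step}: R2 = {fit.rsquared:.3f}, "
              f"delta R2 = {fit.rsquared - previous_r2:.3f}")
        previous_r2 = fit.rsquared
    # Significance of each R2 change can be tested with an F-test for nested
    # models (e.g., statsmodels.stats.anova.anova_lm on successive fits).

# Usage: hierarchical_r2(analysis_df)  # one row per DEMQOL-CH rating
```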

Patient and public involvement

The development and evaluation of a QoL instrument in care homes was part of a larger project investigating the feasibility of implementing routine QoL measurement into care practice. The lead author, who has experience of working in the social care sector, led this work. We involved care staff in a qualitative study about the feasibility of measuring QoL as a part of their care practice. The findings from this study informed the rest of the project and carried forward staff views such as where they could fit the measurements into their practice and how often this could be achieved.

Results

Sample characteristics

Nineteen care staff took part in phase I of the study, and 11 took part in phase II (table 1). The majority of staff were female, White British and worked as direct care staff (ie, senior care assistants, care assistants and nurses). Time working in the care home was over 38 months, and staff worked on average 38–40 hours per week over 4 days. Twenty-eight residents had their QoL measured in phase I and 42 in phase II. Residents in phase II were predominantly female (74%); gender was not recorded for phase I residents. CDR ratings for residents in phase II showed that one resident (2%) had no dementia, six (14%) had questionable dementia, nine (21%) had mild, eight (19%) moderate and 18 (43%) severe dementia. Only nine residents were able or willing to complete the sMMSE, so sMMSE scores are not reported. The one resident without dementia was excluded from the following analyses as subgroup analysis could not be carried out.

Table 1

Characteristics of care staff in phase I and phase II

Phase I

Agreement between self-administered and interviewer-administered DEMQOL-Proxy—The mean QoL score for the self-administered instrument (96.0, SD=15.3) was statistically significantly lower than for the interviewer-administered instrument (100.9, SD=10.1) (t=3.82, df=91, p<0.001). Bland–Altman plot analysis (figure 2A) showed a mean difference between the QoL scores of 4.9 (SD=12.2, 95% CI 2.3 to 7.4). The upper and lower limits of agreement were calculated as 28.8 (95% CI 24.5 to 33.1) and −19.1 (95% CI −23.3 to −14.8), respectively. The mean difference was just smaller than the five-point cut-off score set. However, the wide limits of agreement, a line of equality outside the 95% CI of the mean difference and evidence of proportional bias (t(90)=−5.08, p<0.001) (figure 2) indicated that the measurement bias between the two instruments was significant. To investigate this further, the differences in means between the factors (emotion and functioning) of DEMQOL-Proxy when self-administered and interviewer-administered were analysed. A Wilcoxon matched-pairs signed-rank test showed a statistically significant difference for the function factor (z=−4.173, p<0.001) but not the emotion factor (z=−0.139, p=0.889), suggesting a systematic difference in the way staff reported on questions in the function factor.

Figure 2

Bland–Altman plots of agreement between (A) DEMQOL-Proxy self-administered and DEMQOL-Proxy interviewer-administered, and (B) DEMQOL-Proxy and DEMQOL-CH. A difference of zero would indicate perfect agreement, as indicated by the black line. The mean difference between the two instruments is indicated by a red dashed line, the upper and lower 95% limits of agreement by blue dashed lines, and 95% CIs by green dotted lines. Evidence of proportional bias is shown by the regression line with 95% CI.

Derivation of DEMQOL-CH—The findings from the agreement analysis reported above and from researcher observations suggested that staff were misinterpreting some of the questions of the DEMQOL-Proxy instrument when it was self-administered. The structure of the questions contains a timeframe, stem and item. The timeframe and stem precede the item and response and are located above the section of questions. In the functioning factor questions, respondents report ‘how worried’ the resident has been. Care staff may have neglected to read this part of the question in the self-administered instrument, resulting in responses based on whether or not the resident could perform or had problems with the function, leading to the poor agreement observed. Figure 1 summarises the changes made to question structure between DEMQOL-Proxy and DEMQOL-CH.

DEMQOL-CH: DEMQOL-CH is a 31-item self-administered QoL questionnaire derived by altering the structure of the stem of DEMQOL-Proxy questions to emphasise the QoL aspect of each question and make it suitable for self-administration by care home staff.

Agreement between DEMQOL-CH and DEMQOL-Proxy—A paired samples t-test showed that there was no statistically significant difference between the means of the self-administered DEMQOL-CH (M=99.7, SD=9.7) and the interviewer-administered DEMQOL-Proxy (M=99.3, SD=9.6) (t=−0.57, df=50, p=0.57). Bland–Altman plot analysis (figure 2B) showed a mean difference between the QoL scores of 0.36 (SD=4.5, 95% CI 0.9 to −1.6). The upper and lower limits of agreement were calculated as 8.5 (95% CI 6.4 to 10.7) and −9.2 (95% CI −11.4 to −7.1), respectively. A mean difference smaller than five points, narrow limits of agreement and no evidence of proportional bias (t(50)=−0.23, p=0.819) suggested acceptable agreement between the two instruments, supporting the validity of DEMQOL-CH.

Phase II

Psychometric properties of DEMQOL-CH

Acceptability and data quality—Table 2 provides descriptive data on DEMQOL-CH; there were no missing data and no floor effects. There was a small percentage of ceiling effects (0.6%). DEMQOL-CH scores ranged from 59 to 124.

Table 2

Descriptive statistics of DEMQOL-CH

Reliability—The internal consistency of DEMQOL-CH was excellent (α=0.90, 95% CI 0.88 to 0.92). For inter-rater reliability, testing ratings made up to 7 days apart, the ICC was 0.40 (95% CI 0.06 to 0.65) (n=31). Test–retest reliability had an ICC of 0.72 (95% CI 0.54 to 0.84) for ratings made up to 4 weeks apart.

Validity—The KMO measure of sampling adequacy was 0.84, indicating that the sample was suitable for factor analysis. Bartlett’s test of sphericity was significant (p<0.001). The scree plot suggested either a five-factor or a four-factor structure. After examining each solution, the four-factor solution was chosen as it provided a clearer factor structure. One item (‘keeping him/herself clean’) did not load onto any factor above 0.40 but loaded onto factor one at 0.386. Factor loadings and Cronbach’s α for each factor after rotation are shown in table 3; loadings of <0.3 are not included. Comparing DEMQOL-CH with DCM, a small statistically significant positive correlation was found between DEMQOL-CH and the percentage of observed ‘good’ behaviours (%BCC type 1) (rs=0.34, n=35, p=0.024, one-tailed). We observed no other significant correlations. No statistically significant correlations were found for those with mild dementia. For those with moderate dementia, a strong positive correlation was found between DEMQOL-CH and mean WIB score (rs=0.67, n=13, p=0.006, one-tailed) and a moderate positive correlation between DEMQOL-CH and %BCC type 1 (rs=0.55, n=13, p=0.026, one-tailed).

Table 3

Rotated factor loadings for DEMQOL-CH (n = 154)

Factors associated with QoL scores—The hierarchical multiple regression showed that at step 1, resident gender and resident dementia severity did not contribute significantly to the regression model (F (4,112)=1.97, p=0.103), accounting for just 3.2% of the variation in QoL. The introduction of the time variables explained an extra 11.1%. This change in R2 was statistically significant (F (7,109)=3.76, p=0.001). Finally, adding staff rating confidence to the regression model explained an additional 10.3% of the variance in QoL, and this change in R2 was also significant (F (8,108)=5.74, p<0.001). See table 4 for a summary of the hierarchical linear regression analysis results. When all 10 predictor variables were included in step 3 of the regression model, there were three significant predictors of QoL; these were: time working in the care home (p<0.05), hours worked per day and confidence (both p<0.001).

Table 4

Summary of hierarchical regression analysis for variables predicting QoL using DEMQOL-CH (n=89)

Discussion

There are three main findings from this study. First, that DEMQOL-Proxy, and by extension other instruments measuring QoL designed to be interviewer administered, cannot be completed by self-report by care workers without significant error. Second, we found that it was possible to adapt DEMQOL-Proxy into DEMQOL-CH, a version of the former that is specifically altered to work as a self-report instrument for use by care staff in care homes. Third, these preliminary results suggest that the psychometric properties of DEMQOL-CH may be acceptable for its use in this setting.

Results from the agreement analysis between self-administered and interviewer-administered versions of DEMQOL-Proxy show that there are important limitations in using DEMQOL-Proxy as a self-administered instrument by care staff. Without an interviewer, care staff appeared to focus on the functional abilities of residents instead of the impact of the limitation on well-being. This appeared to be due to the lack of interviewer direction and the structure of the questions, causing systematic error in the QoL score. It is unsurprising that care staff focus on resident functional ability when one of the primary roles of care staff is to monitor and report resident functional abilities and changes. In this respect, care staff may be different from family carers, who have been reported to be able to self-complete DEMQOL-Proxy without such error.37 Although there is conflicting evidence,38 higher functional impairment has been reported to be more closely associated with proxy scores than with self-reports.11 12 39–41 We found that there was miscoding, despite staff being given training and the instruments’ standardised instructions. Interviewer administration reduces the likelihood of such error as the interviewer can prompt the respondent and ensure they have understood the question. However, it has also been suggested that interviewer-administered QoL instruments generally yield higher QoL scores than self-administered instruments because participants may feel the need to please the researcher.42–44 While interviewer administration is possible in research projects where the funding for interviewers is available, it is not likely to be an option in routine care practice given the resource constraints in the care sector. We therefore modified the framing of DEMQOL-Proxy questions to enable their use in this setting by self-report.

Our findings highlight the need to assess agreement when the mode of administration of an instrument is changed. In a study of a self-administered version of DEMQOL-Proxy, Hendriks et al found family carers could complete the instrument reliably without an interviewer to administer it.37 However, clinic staff were available to respond to any questions from proxy respondents, and those staff needed to understand the purpose of the instrument to enable it to be completed correctly. The finding here of good reliability but poor agreement between self-administered and interviewer-administered DEMQOL-Proxy scores supports the need for such broad assessment of agreement.

The preliminary psychometric analyses of DEMQOL-CH reported here show overall validity and reliability comparable to other available instruments. DEMQOL-CH has the added advantage of simplicity of use by care staff, promoting the possibility of routine use in care settings. DEMQOL-CH showed excellent internal consistency and acceptable test–retest reliability; however, inter-rater reliability was below the set criterion. This requires further research; it may be a function of the small sample size, variability in staff characteristics or individual change over time. Although results from the regression analysis found that staff occupational characteristics accounted for only a small proportion of the variance, the inter-rater reliability finding could represent other unmeasured characteristics of staff. Studies have reported that care staff proxy ratings of QoL can be affected by factors other than sociodemographic or occupational characteristics such as stress45 and staff attitudes towards people with dementia.46 These were not assessed in this study. Good inter-rater reliability may be of particular value in long-term care given the high staff turnover.

Exploratory factor analysis found a four-factor structure of functioning, positive emotion, negative emotion and engagement. These factors differ from the findings in the DEMQOL-Proxy development papers17 but are similar to recent larger studies assessing DEMQOL-Proxy factor structure.29 30 This observed difference may also be driven by this being a care home population. A hierarchical linear regression analysis identified that the only significant predictors of QoL, as measured by DEMQOL-CH, were: the amount of time staff had worked in the care home; the number of hours staff worked per day and staff confidence. Lower QoL was associated with the amount of time that staff worked in the care home and number of hours worked per day, whereas higher QoL was associated with staff confidence. Dementia severity was not a predictor of QoL, adding to the literature that dementia severity is not a good proxy for QoL in dementia. We also found that the more hours staff worked per day and the longer staff had worked in the care home were negatively associated with QoL. So higher staff and resident contact may result in lower QoL ratings by staff. This has been found in previous studies where more stable staff groups, resident assignment and number of days worked were related to lower QoL.46 47 The regression analysis carried out in this study found that increased staff confidence was associated with higher QoL. This may have been their confidence in rating the QoL instrument or their confidence in rating the resident. Staff confidence has been found to be an important factor in caring for people with dementia48 and is also associated with higher QoL scores using the QoL-AD.45

Although the findings of this study must be considered preliminary, they provide initial evidence that routine QoL measurement in care homes is feasible. Care staff were able to collect repeated regular QoL measurements for a large number of residents during the study. The findings support the need for further investigation of routine QoL measurement to assess in more detail the implementation and acceptability of DEMQOL-CH in terms of practical application and ease of use and also to explore the potential uses of routine measurement on both QoL and quality of care.

Strengths and limitations of this study

A limitation of the study is the sample size used for the psychometric evaluation of DEMQOL-CH. This was smaller than is normally used for definitive psychometric analysis. There were a sufficient number of observations to perform the analyses; however, these were carried out on repeated measurements. Therefore, the findings need to be interpreted with caution as the lack of independence in the observations may have affected the findings presented. Future studies need to evaluate the psychometric properties of DEMQOL-CH using independent data in large representative samples. The findings reported here should be considered preliminary. They are however encouraging and lay the foundations for and inform the content of further research. In addition, future research also needs to consider assessing other aspects of DEMQOL-CH such as discriminant validity and responsiveness to provide a more comprehensive assessment of its psychometric properties. Another limitation is that this research did not assess staff factors such as their QoL, burden and attitudes. This was omitted to reduce staff burden since the main aim of the study was to develop a way to implement routine QoL measurement into care practice. Assessing the effects of staff characteristics on QoL ratings should be a focus in future studies since a proxy’s own QoL,49 50 burden,47 51 burnout and satisfaction with life47 can affect their proxy ratings of the QoL of the person with dementia. A final limitation of the study is that the care homes included in the study are not likely to provide a representative picture of homes in England more generally; for example, there was an over-representation of nursing homes (four homes—57%) included in the study in comparison to national averages (11%).52 However, it is encouraging that the characteristics of care staff participants are in line with the wider social care workforce as reported in the Skills for Care National Minimum Data Set, the leading source for adult social care workforce information.52 53 Staff age, gender, ethnicity, time working in the care sector and time working in the care home are similar to national returns.

Conclusion

This paper is the first to provide a preliminary demonstration of the potential for the modification of an existing instrument to work in care homes. It provides a description and a preliminary validation of a potentially promising instrument and approach that can be used in future care home research and practice. The method and results used in this study illustrate that QoL instruments may need modification when they are used for self-completion of proxy reports by care home staff in care homes. Future use of DEMQOL-CH should assess psychometric properties in a larger sample of more representative care homes.

References


Footnotes

  • Contributors LH, NF and SB were all involved in the design of the study. LH collected and analysed the data and wrote the first draft of the manuscript. TEP assisted with the data analysis. All authors provided input and amendments to the manuscript.

  • Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.

  • Competing interests None declared.

  • Patient consent for publication Not required.

  • Ethics approval Ethical approval was obtained from the Health Research Authority (Social Care Research Ethics Committee) in August 2015.

  • Provenance and peer review Not commissioned; externally peer reviewed.

  • Data availability statement Data are available upon reasonable request.