Short communication
Measuring quality of patient information documents with an expanded EQIP scale
Introduction
The provision of high quality information is a legal responsibility of healthcare institutions and professionals in many countries [1], [2]. According to international guidelines [3], [4], [5], [6], [7], [8], [9], three aspects of patient information documents that should be assessed are content, structure, and identification data.
Content encompasses descriptions of the illness, all treatment options with their consequences in detail, a list of sources of information and support, a space for questions, and contact information [7], [8], [9], [10]. In terms of structure, information should be balanced, evidence-based and referenced, easily understandable, relevant to the target population, regularly updated, hierarchically displayed and illustrated [7], [8], [9], [10]. Regarding identification data, the date of issue and the names of the entities responsible for editing and financing the document should be specified. Patients should be involved (and acknowledged) in determining document acceptability and relevance [7], [8], [9], [10].
Topic-specific [11], [14] or generic [15], [16], [17], [18], [19], [20], [21] tools have been proposed to evaluate the quality of patient information. The Ensuring Quality Information for Patients (EQIP) instrument aims to “assess quality of patient information, applicable to all information types, and prescribe the required action” [21]. Several EQIP criteria coincide with those of the British Medical Association (BMA) patient information award appraisal form [19]. Both have proved useful in surveys of patient information leaflets [13], [21]. However, additional criteria for evaluating patient information have recently been proposed [7], [8], [9], [12].
In this study we aimed to (1) expand EQIP with criteria derived from a recent literature review; (2) restructure the expanded tool according to the three dimensions of content, structure and identification data; (3) use the new tool to assess the quality of information documents in a large university hospital.
Background
The study was conducted at Geneva University Hospitals (Switzerland). The hospital has about 2200 beds and records roughly 48,000 admissions and 785,000 hospitalisation days annually. It had no official guidelines regarding patient information documents.
Document selection
We gathered 243 currently used documents, of which 162 met the selection criteria of describing a medical intervention with direct interaction between a patient and a health professional. Multiple identically structured documents
Documents
The 73 documents provided information about an examination or diagnostic test (12), a medical treatment (7), an invasive but non-surgical procedure (12), a surgical procedure (33), anaesthesia (4) and other topics (5). They covered the fields of anaesthesiology (5), pharmacology (1), surgery (14), gynaecology (11), obstetrics (6), internal medicine (8), neurology and neurosurgery (11), ophthalmology (12), otorhinolaryngology (1), paediatrics (1), radiology (2), and nosocomial infection prevention (1).
Discussion
The expanded version of EQIP (EQIP36) showed good inter-rater reliability, with κ coefficients generally higher (mean κ = 0.84) than those of the original EQIP tool (mean κ = 0.60) [21]. This might be attributable to the time spent adjusting assessment rules during the preliminary evaluation phase on 25 documents. Two criteria (“respectful tone” and “information presented in a logical order”) with low κ coefficients require further improvement. The subjective nature of these criteria may explain the low
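For readers unfamiliar with the statistic, the inter-rater agreement reported above is Cohen's κ, which corrects raw agreement for the agreement expected by chance from each rater's marginal frequencies. The following is a minimal illustrative sketch (not the authors' actual analysis code); the rater labels and the ten-document example are hypothetical.

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed proportion
    of agreement and p_e the agreement expected by chance, computed from
    each rater's marginal label frequencies.
    """
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of items where both raters gave the same label
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of the two raters' marginal proportions, summed over labels
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    p_e = sum(counts_a[label] * counts_b.get(label, 0) for label in counts_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two raters scoring 10 documents on one yes/no criterion
a = ["yes", "yes", "no", "yes", "no", "yes", "yes", "no", "yes", "yes"]
b = ["yes", "yes", "no", "no", "no", "yes", "yes", "no", "yes", "yes"]
print(round(cohen_kappa(a, b), 2))  # → 0.78
```

By convention, κ values above about 0.8 (such as the mean κ = 0.84 reported here) are usually read as very good agreement, whereas values near 0.6 (the original EQIP's mean) indicate only moderate-to-good agreement.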
Acknowledgements
We wish to acknowledge and thank B. Moult, L.S. Franck, H. Brady, authors of the first EQIP paper, who kindly provided us with details of the EQIP scale and manual.
Competing interests: none.
Financial support was generously provided by Geneva University Hospitals, as part of the Quality of Care fund (competitive review). The funding agreement ensured the authors’ complete independence in designing the study, interpreting the data, writing and publishing the report.
References (27)
- et al. Evaluating the reliability and validity of three tools to assess the quality of health information on the Internet. Patient Educ Couns (2003)
- et al. What do patients with prostate or breast cancer want from an Internet site? A qualitative study of information needs. Patient Educ Couns (2004)
- et al. A history and theory of informed consent (1986)
- et al. Informed consent for medical treatment and research: a review. Oncologist (2005)
- et al. Informing patients. An assessment of the quality of patient information materials (1998)
- General Medical Council. Seeking patients’ consent: the ethical considerations. London: General Medical Council,...
- National Health and Medical Research Council. How to prepare and present evidence-based information for consumers of...
- National Health and Medical Research Council. Communicating with patients: advice for medical practitioners,...
- Haute Autorité de Santé. Guide méthodologique: élaboration d’un document écrit d’information à l’intention des patients...
- International Patient Decision Aid Standards collaboration. Background document,...
- A PIL for every ill? Patient information leaflets (PILs): a review of past, present and future use. Fam Pract
- How risks of breast cancer and benefits of screening are communicated to women: analysis of 58 pamphlets. Br Med J