Article Text

Identifying clinical features in primary care electronic health record studies: methods for codelist development
Jessica Watson,1 Brian D Nicholson,2 Willie Hamilton,3 Sarah Price3

1 Centre for Academic Primary Care, Bristol Medical School, University of Bristol, Bristol, UK
2 Nuffield Department of Primary Care Health Sciences, University of Oxford, Oxford, UK
3 University of Exeter Medical School, Exeter, UK

Correspondence to Dr Jessica Watson; jessica.watson{at}


Objective Analysis of routinely collected electronic health record (EHR) data from primary care is reliant on the creation of codelists to define clinical features of interest. To improve scientific rigour, transparency and replicability, we describe and demonstrate a standardised reproducible methodology for clinical codelist development.

Design We describe a three-stage process for developing clinical codelists. First, the clinical feature of interest is clearly defined a priori using reliable clinical resources. Second, a list of potential codes is developed using statistical software to search all available codes comprehensively. Third, a modified Delphi process is used to reach consensus between primary care practitioners on the most relevant codes, including the generation of an ‘uncertainty’ variable to allow sensitivity analysis.
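As an illustrative sketch of the second stage, the search of a code dictionary can be automated in any statistical software; the Python below is not the authors' published syntax, and the dictionary excerpt and search terms are hypothetical:

```python
import re

# Hypothetical excerpt of a clinical code dictionary as (code, description)
# pairs; a real study would load the full CPRD medical dictionary instead.
code_dictionary = [
    ("173C.00", "Shortness of breath"),
    ("1739.00", "Shortness of breath symptom"),
    ("R060300", "[D] Breathlessness"),
    ("663e.00", "Asthma restricts exercise"),
    ("14B4.00", "H/O: asthma"),
]

# Stage 1 output: search terms agreed a priori from clinical resources.
search_terms = ["breathless", "shortness of breath", "dyspnoea", "dyspnea"]

# Build one case-insensitive pattern matching any of the agreed terms.
pattern = re.compile("|".join(re.escape(t) for t in search_terms),
                     re.IGNORECASE)

# Stage 2: keep every code whose description matches any search term;
# the resulting candidate list then goes forward to the Delphi review.
candidates = [(code, desc) for code, desc in code_dictionary
              if pattern.search(desc)]

for code, desc in candidates:
    print(code, desc)
```

Keeping the search as an explicit script, rather than a manual dictionary browse, is what makes the candidate list auditable and re-runnable when the underlying code dictionary is updated.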

Setting These methods are illustrated by developing a codelist for shortness of breath in a primary care EHR sample, including modifiable syntax for commonly used statistical software.

Participants The codelist was used to estimate the frequency of shortness of breath in a cohort of 28 216 patients aged over 18 years who received an incident diagnosis of lung cancer between 1 January 2000 and 30 November 2016 in the Clinical Practice Research Datalink (CPRD).

Results Of 78 candidate codes, 29 were excluded as inappropriate. Complete agreement was reached for 44 (90%) of the remaining codes, with partial disagreement over 5 (10%). 13 091 episodes of shortness of breath were identified in the cohort of 28 216 patients. Sensitivity analysis demonstrates that codes with the greatest uncertainty tend to be rarely used in clinical practice.
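The sensitivity analysis enabled by the ‘uncertainty’ variable can be sketched as follows; the codes, flags and event counts here are invented for illustration and do not correspond to the study's data:

```python
# Hypothetical post-Delphi codelist: 'uncertain' is True where reviewers
# partially disagreed; 'events' is the code's observed event count.
codelist = [
    {"code": "173C.00", "uncertain": False, "events": 9000},
    {"code": "1739.00", "uncertain": False, "events": 3000},
    {"code": "R060300", "uncertain": True,  "events": 120},
    {"code": "R060z00", "uncertain": True,  "events": 40},
]

total = sum(c["events"] for c in codelist)
certain_only = sum(c["events"] for c in codelist if not c["uncertain"])

# Sensitivity analysis: how much does excluding uncertain codes change
# the estimated frequency of the clinical feature?
print(f"All codes: {total}, certain only: {certain_only}, "
      f"difference: {100 * (total - certain_only) / total:.1f}%")
```

When, as the results suggest, the most uncertain codes are also the most rarely used, the difference between the two estimates will be small, supporting the robustness of the codelist.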

Conclusions Although initially time consuming, a rigorous and reproducible method for codelist generation ‘future-proofs’ findings, and an auditable, modifiable syntax enables sharing and replication of EHR studies. Published codelists should be badged by quality and should report the methods of codelist generation, including: the definitions and justifications associated with each codelist; the syntax or search method; the number of candidate codes identified; and the categorisation of codes after Delphi review.

  • electronic health records
  • clinical coding
  • primary care
  • epidemiology

This is an Open Access article distributed in accordance with the terms of the Creative Commons Attribution (CC BY 4.0) license, which permits others to distribute, remix, adapt and build upon this work, for commercial use, provided the original work is properly cited. See:

  • Contributors WH conceived and SP enhanced the methods of codelist collation described in the paper. JW wrote the original outline of the paper. SP designed and performed the data analysis. JW, SP and BDN developed the first draft of the paper. All authors contributed to subsequent drafts and read and approved the final manuscript.

  • Funding JW (DRF-2016-09-034) and BDN (DRF-2015-08-18) are both funded by Doctoral Research Fellowships from the National Institute for Health Research Trainees Coordinating Centre. WH is part-funded by the National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care South West Peninsula at the Royal Devon and Exeter NHS Foundation Trust.

  • Disclaimer The views expressed are those of the authors and not necessarily those of the NHS, the NIHR or the Department of Health.

  • Competing interests None declared.

  • Provenance and peer review Not commissioned; externally peer reviewed.

  • Data sharing statement CPRD data on which the sensitivity analysis was based is held securely by University of Exeter Medical School under the CPRD data access licence (
