
Original research
Infographic summaries for clinical practice guidelines: results from user testing of the BMJ Rapid Recommendations in primary care
  1. Pieter Van Bostraeten1,
  2. Bert Aertgeerts1,
  3. Geertruida E Bekkering1,
  4. Nicolas Delvaux1,
  5. Charlotte Dijckmans1,
  6. Elise Ostyn1,
  7. Willem Soontjens1,
  8. Wout Matthysen1,
  9. Anna Haers1,
  10. Matisse Vanheeswyck1,
  11. Alexander Vandekendelaere1,
  12. Niels Van der Auwera1,
  13. Noémie Schenk1,
  14. Will Stahl-Timmins2,
  15. Thomas Agoritsas3,4,
  16. Mieke Vermandere1
  1. 1Department of Public Health and Primary Care, KU Leuven, Leuven, Flanders, Belgium
  2. 2Data Graphics Designer, The BMJ, London, UK
  3. 3Department of Health Research Methods, Evidence, and Impact, McMaster University, Hamilton, Ontario, Canada
  4. 4Division General Internal Medicine, Department of Medicine, University Hospitals of Geneva, Geneva, Switzerland
  1. Correspondence to Dr Pieter Van Bostraeten; pieter.vanbostraeten{at}kuleuven.be

Abstract

Objectives Infographics have the potential to enhance knowledge translation and implementation of clinical practice guidelines at the point of care. They can provide a synoptic view of recommendations, their rationale and supporting evidence. They should be understandable and easy to use. Little evaluation of these infographics regarding user experience has taken place. We explored general practitioners’ experiences with five selected BMJ Rapid Recommendation infographics suited for primary care.

Methods An iterative, qualitative user testing design was applied to two consecutive groups of 10 general practitioners for five selected infographics. The physicians used the infographics before clinical encounters and we performed hybrid think-aloud interviews afterwards. Twenty interviews were analysed using the Qualitative Analysis Guide of Leuven.

Results Many clinicians reported that the infographics were simple and rewarding to use, time-efficient and easy to understand. They were perceived as innovative and their knowledge basis as trustworthy and supportive for decision-making. The interactive, expandable format was preferred over a static version as general practitioners focused mainly on the core message. Rapid access through the electronic health record was highly desirable. The main issues concerned the use of complex scales and terminology. Understanding terminology related to evidence appraisal, as well as the interpretation of statistics and unfamiliar scales, remained difficult despite the infographics.

Conclusions General practitioners perceive infographics as useful tools for guideline translation and implementation in primary care. They offer information in an enjoyable and user-friendly format and are used mainly for rapid, tailored and just-in-time information retrieval. We recommend that future infographic producers provide information as concisely as possible, carefully define the core message and explore ways to enhance the understandability of statistics and difficult concepts related to evidence appraisal.

Trial registration number MP011977.

  • information technology
  • protocols & guidelines
  • decision making
  • quality in health care
  • medical education & training
  • primary care



This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/.


Strengths and limitations of this study

  • By adopting a hybrid approach, combining think aloud interview techniques with a previously set interview guide, we balanced capturing immediate reactions and flow of thought with affording participants the opportunity to reflect more thoroughly, thereby facilitating a more nuanced understanding of their experiences.

  • The infographics were used and evaluated in a natural environment, making the findings more applicable to a real-life setting.

  • The infographics were evaluated with a specific target group, making the findings of this study user-centred and directly applicable to infographic development for general practitioners (GPs).

  • Experiences and opinions of Dutch-speaking, Belgian GPs may not be generalisable to other professions, cultures or healthcare systems.

  • We did not evaluate effectiveness on knowledge retention or impact on guideline adherence or health outcomes, nor did we compare the infographics to different formats.

Background

The body of evidence in healthcare is growing rapidly and has become impossible for individual healthcare providers to manage. There is a constant need to integrate evidence-based care into daily practice.1 Nevertheless, an important gap remains between current research findings and what is actually implemented in practice.2–5 Clinical practice guidelines (CPGs) aim to fill this gap, particularly when their development process is trustworthy.6 7 Adherence to CPGs shows promising results on patient outcomes and healthcare costs.8–11 Implementation of these guidelines, however, is lacking due to many identified barriers.12–14 Lack of time, complexity of the guideline and limited applicability to the individual patient are recurring issues.15–17

To enhance the translation and implementation of evidence, formats such as infographics have been proposed.17 Infographics use visuals such as charts, icons and illustrations to convey information and data with a minimum of text.18 Through visualisation, infographics have the potential to convey health statistics in a transparent and understandable way; statistical illiteracy is known to be common not only in patients but also in physicians, with serious health consequences.19 By presenting information visually, they are believed to result in superior understanding and knowledge retention by decreasing the cognitive load placed on readers.18 20 21 This might help meet the need for fast information retrieval in current practice, where information is abundant.22–24 Infographics are also thought to increase dissemination and readership of research among practitioners.25 26

Infographics have been perceived positively by physicians. They find infographics easy, efficient and enjoyable to read, user-friendly, informative and useful.27–29 Physicians prefer certain infographic summaries over text-only summaries30 and online abstracts,29 and perceive less cognitive load when using them.30 While some physicians have claimed that infographics are more likely to support long-term retention of knowledge,29 study results on this matter are mixed. In fact, only limited research has explored the effectiveness of infographics, and it has mainly focused on patients.31 32 One comparison with text-only summaries found no difference in knowledge retention after 4 weeks in emergency physicians, with retention being poor in both formats.30 Similarly, immediately after studying an infographic, physicians retained no more knowledge than after reading scientific abstracts or plain language summaries.28 Some authors even suggest that infographics might be harmful, as information is not read in depth and results might be presented inaccurately or in an oversimplified way.25 Even though it is difficult to draw conclusions on the effectiveness of infographics in general, given their heterogeneity in format and the varying outcome measures,31 these findings raise concern, especially since infographics are costly and time consuming to produce.22

It is clear that there is a need to explore how, when and in what circumstances infographic summaries can be useful to physicians. This is especially true in primary care, where guideline infographics hold great potential because healthcare needs vary widely while time to keep up to date is scarce.

Recently, The BMJ started providing interactive infographics for each of their published Rapid Recommendations (also known as ‘RapidRecs’).12 The RapidRecs are an international project led since 2016 by the MAGIC Evidence Ecosystem Foundation (www.magicevidence.org) in collaboration with The British Medical Journal (BMJ).12 Their aim is to create and disseminate a new generation of trustworthy, timely and actionable recommendations on the basis of new practice-changing evidence as well as complex, ignored evidence.33 The RapidRecs follow GRADE methodology and standards (a systematic approach to rating certainty of evidence in evidence syntheses),34 summarise the whole body of evidence in one or more systematic reviews and involve panels including experts, general practitioners and patients. They also digitally structure the guideline in the MAGICapp, an authoring and publication platform for evidence summaries, guidelines and patient decision aids. Using the digitally structured data in the MAGICapp, MAGIC and The BMJ have co-created their guideline infographics, drawing on the skills of information designers, editors, clinicians and experts in evidence-based medicine.

This study aims to evaluate the experiences of general practitioners (GPs) when using a selection of the BMJ Rapid Recommendations infographics suited for primary care. Through user testing, we explored how BMJ Rapid Recommendations infographics are used by GPs and how they may support the translation and implementation of evidence. Based on our findings, we aim to provide recommendations for the future development of infographic summaries for CPGs that can be used by GPs.

Methods

Study design

We applied an iterative, qualitative user testing design to evaluate the user experiences of GPs using the BMJ Rapid Recommendations infographics in real clinical practice (figure 1). GPs used five RapidRecs infographics, translated from English to Dutch, as evidence-based background information for clinical encounters. After a period of 2–4 months, we conducted interviews with the GPs using a hybrid think-aloud method. Small refinements were made to the infographics based on the first round of user testing (see below). The refined infographics were then used by a different group of GPs for the same period, followed by a second round of interviews. This second group of GPs also used the infographics as support for linked patient decision aids (PDAs) developed within the MAGICapp. User testing of the PDAs took place in a parallel study.35

Figure 1

Iterative qualitative user testing design.

Intervention

Five out of 20 BMJ RapidRecs were chosen based on their relevance for general practice during a consensus meeting with the research team. The topics were the following: thyroid hormone treatment for subclinical hypothyroidism,36 prostate cancer screening,37 antibiotics for uncomplicated skin abscesses,38 corticosteroids for the treatment of sore throat39 and arthroscopic surgery for degenerative knee arthritis and meniscal tears.40 We performed a forward–backward translation, in which five GP-trainees translated the infographics into Dutch, followed by a backward translation by an independent native English speaker to check for translation issues. This forward–backward method was performed prior to both rounds of user testing. In the first user testing round, we did not have access to the original BMJ templates. Hence, we had to recreate the infographics ourselves in Dutch, preserving the original layout as much as possible. We made the Dutch infographics available in both a digital static PDF version and a paper version, and we encouraged the GPs to also explore the publicly available interactive versions, which were not translated. The interactive versions offered enhanced functionality, including the ability to collapse and expand more detailed information in certain sections and to hover over specific terms for explanatory text boxes. An example of one interactive infographic can be accessed through this link. We did not investigate how many GPs actually explored these versions or how much they used them. After the first round, small refinements were made based on the results of the interviews. These refinements mainly addressed small graphical or translation errors discovered during the first round of evaluation, which could be resolved because we were then able to use the real, static BMJ templates to generate the translated versions. The graphical errors concerned issues such as lower image resolution or incorrect outlines. We also emphasised that GPs could use the online interactive formats if they were comfortable with the English language. In the second round, the refined static PDF formats were also available through the ‘evidence linker’, a tool integrated in the electronic health record (EHR) that provides direct online access to clinical guidelines connected to certain coded diagnoses, facilitating evidence-based care.41 We provide an example of one original infographic (figure 2). All original, translated and refined infographics are provided as supplementary materials (see online supplemental materials 1–3, respectively).
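To illustrate the idea behind the evidence linker described above, the minimal sketch below shows how a diagnosis coded in the EHR could, in principle, be mapped to a linked guideline infographic. This is a hypothetical illustration only, not the actual evidence linker implementation; the ICPC-2 codes, URLs and function names are assumptions chosen to mirror the five study topics.

# Hypothetical sketch of the 'evidence linker' idea: a diagnosis coded in the EHR
# is mapped to the matching guideline infographic. The codes and URLs below are
# illustrative assumptions, not the actual mapping used by the evidence linker.

INFOGRAPHIC_LINKS = {
    "T86": "https://example.org/infographics/subclinical-hypothyroidism",   # hypothyroidism (illustrative)
    "Y77": "https://example.org/infographics/psa-screening",                # prostate cancer (illustrative)
    "S10": "https://example.org/infographics/skin-abscess-antibiotics",     # skin abscess (illustrative)
    "R72": "https://example.org/infographics/sore-throat-corticosteroids",  # sore throat (illustrative)
    "L90": "https://example.org/infographics/knee-arthroscopy",             # knee osteoarthrosis (illustrative)
}


def lookup_infographic(diagnosis_code: str) -> str | None:
    """Return the infographic URL linked to a coded diagnosis, if one exists."""
    return INFOGRAPHIC_LINKS.get(diagnosis_code.upper())


if __name__ == "__main__":
    code = "R72"  # diagnosis coded during the consultation
    url = lookup_infographic(code)
    print(url if url else "No linked infographic for this diagnosis code")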

Figure 2

Example of an original infographic: ‘Arthroscopic surgery for degenerative knee arthritis and meniscal tears’. This material has been reproduced with permission from BMJ. BMJ, British Medical Journal.

Participants, recruitment and setting

This study was set in primary healthcare in the Dutch-speaking part of Belgium (Flanders) and was performed by 10 GP-trainees as part of their 3-year postgraduate programme thesis. The GP-trainees invited all 21 of their GP-trainers to participate. Invitations were made by phone or in person, accompanied by an information letter. We aimed to recruit at least 10 GPs in each round, striving for a representative sample with heterogeneity in gender, age and geographical spread. Eventually, 20 out of 21 eligible GP-trainers were enrolled. User testing took place in the office of the participating GP. Round one of the user testing lasted from December 2019 to January 2020, and round two from October 2020 to January 2021.

Data collection

Prior to each round, the supervising team (BA, MVe, TA and ND) instructed the GPs on how to use the translated infographics in daily practice through an online training of approximately 1.5 hours. The training course consisted of an explanation of the study design and a short introduction to the concept of infographics, how they are made and how to use them. It was made clear that they are guideline summaries meant to be used by physicians to keep up to date and are not meant as decision aids for patients. The course enabled GPs to start using the infographics immediately, allowing us to focus mainly on experiences in daily practice. After each test period, the user experience of the GP-trainers was collected through an interview. For this interview, we adopted a hybrid methodology that merged aspects of the traditional think-aloud method with the use of an interview guide. The think-aloud method enables participants to verbalise thoughts that would otherwise often remain silent.42 43 We asked the GPs to express their thoughts when going through the infographics while preparing for fictional clinical scenarios (see online supplemental material 4), acting as if it were a real-life situation. Typically, when using the think-aloud method, researchers refrain from asking prepared questions. Recognising the inherent limitations of relying solely on the think-aloud method, however, we nevertheless introduced an interview guide (see online supplemental material 5) to delve deeper into aspects of experience and usability that might not naturally surface through real-time verbalisation alone. The interview guide prompted participants to discuss various usability dimensions and provide context-rich insights into their interactions with the tool. Interviewers were trained in this hybrid technique by the supervisory team prior to the interviews. By adopting this approach, we aimed to balance capturing immediate reactions with affording participants the opportunity to reflect more thoroughly, thereby facilitating a more nuanced understanding of their experiences. We sought to enhance the breadth and depth of data collected, ultimately contributing to a more robust and multifaceted analysis of GPs’ tool usage experiences.

In the first round, each interview focused on two different infographics: the one translated by the GP-trainee conducting the interview and a second one assigned by consensus. By limiting the number of infographics discussed, we were able to explore the experiences of the GPs in more detail. User testing in round two involved the refined infographics, with refinement based on insights from round one. In contrast to round one, the clinical encounters were observed by the GP trainees and were drawn on during the interview, which took place within days of the GP completing at least three consultations. All interviews were audio-recorded and transcribed verbatim prior to data analysis. From this exercise, qualitative data regarding the users’ perspectives were obtained and analysed.

Data processing

Audio-recorded interviews were transcribed verbatim. Transcriptions were made anonymously and references to the identity of the interviewed GP were avoided. Certain characteristics (age, gender, type of practice, etc) were recorded on the transcriptions, as they were believed to contribute to the quality of the subsequent analysis.

Data analysis

Analysis of the transcripts was based on the Qualitative Analysis Guide of Leuven.44 This guide, containing 10 main stages, outlines the process of coding qualitative interview data through review, discussion and convention. During this process, each comment was coded using a three-layered structure. First, we took sentiment into account by labelling through connotation (ie, positive comments, minor and major frustrations, show stoppers, suggestions and ways of use). Second, all notes were classified into overall themes, which were created inductively through collaboration of all team members. The third layer involved six different categories that were deductively created by means of Morville’s honeycomb model: usability, usefulness, desirability, findability, accessibility and credibility (table 1, see also online supplemental material 6).45 46 We used the online software programme ATLAS.ti for coding the three layers.47
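To make the three-layered coding structure concrete, the sketch below shows one possible way of representing a coded interview fragment. It is a hypothetical illustration only and not part of the actual ATLAS.ti workflow; all type names are assumptions, and the example quote is taken from the Results section.

# Hypothetical sketch of the three-layered coding structure applied to each comment:
# layer 1 = connotation (sentiment), layer 2 = inductively created theme,
# layer 3 = one of the six facets of Morville's honeycomb model.

from dataclasses import dataclass
from enum import Enum


class Connotation(Enum):  # layer 1: connotation labels described in the text
    POSITIVE = "positive comment"
    MINOR_FRUSTRATION = "minor frustration"
    MAJOR_FRUSTRATION = "major frustration"
    SHOW_STOPPER = "show stopper"
    SUGGESTION = "suggestion"
    WAY_OF_USE = "way of use"


class Facet(Enum):  # layer 3: Morville's honeycomb facets
    USABILITY = "usability"
    USEFULNESS = "usefulness"
    DESIRABILITY = "desirability"
    FINDABILITY = "findability"
    ACCESSIBILITY = "accessibility"
    CREDIBILITY = "credibility"


@dataclass
class CodedFragment:
    quote: str                # verbatim interview fragment
    connotation: Connotation  # layer 1
    theme: str                # layer 2: inductively created overall theme
    facet: Facet              # layer 3


# Example coding of a fragment quoted in the Results section
fragment = CodedFragment(
    quote="At a glance, I have the information I want to know.",
    connotation=Connotation.POSITIVE,
    theme="core message at a glance",
    facet=Facet.USABILITY,
)
print(fragment.connotation.value, "|", fragment.facet.value)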

Table 1

Morville’s facets of user experience—definitions (adapted from references 46 and 65)

Each transcript was analysed and coded by two team members other than those who conducted the interview, to limit the impact of individual perspective. Subsequently, the coded fragments were pooled into overall concepts through constant iterative comparison, discussion and consensus among all team members to enhance the reliability of the resulting findings.

Patient and public involvement

No patients were involved in this study as we aimed to evaluate only the user experiences of GPs using already developed infographic guidelines. We hence saw no added value in patient involvement for the design, conduct, reporting or dissemination plans of our research.

Results

Overview of the user experiences

We performed 10 think-aloud sessions in the first group and 10 more in the second group, interviewing 10 GPs each round (table 2).

Table 2

Characteristics of participating GPs

In both rounds, most comments and suggestions from the GPs concerned the usability, usefulness and desirability of elements in the infographics. Overall, there were more positive experiences than negative ones, and comments tended to be relatively more positive in the second round than in the first, probably due to the small refinements we made (see Methods) as well as the access through the EHR, which had been suggested many times during the first round of user testing.

We discuss our findings from the two rounds for each facet of the honeycomb below, with illustrative quotes from the interviews. A summary of the most important results can be found in table 3.

Table 3

Summary of main findings

Usability

Many GPs reported that the tool was simple to use and time-efficient. Retrieving information required little effort once they were familiarised with the tool. However, some infographics were perceived as too confusing and complicated (eg, prostate-specific antigen (PSA) screening and subclinical hypothyroidism). Especially in the first round of our study, several physicians described the design as too crowded, with too much text. They felt reducing the amount of information might help highlight the core message. These observations were also made in round two, but to a lesser extent; by then, most of the GPs had found their way to the interactive English format, where further details could be retrieved, but only if required.

At a glance, I have the information I want to know. It is clear and concise. (round one, 62-year old man, rural duo practice)

I only looked at the printed version. Uhm … gosh … pff … in itself it … In itself I think there is way too much on one page. It’s unclear. The real core messages, they (emphasis) must pop out. For me, they can be bigger and preferably more to the point. (round one, 70-year old man, rural duo practice)

Many GPs said the infographics were easy to understand. Some terminology was not well understood by the GPs in the first group; examples were ‘values and preferences’ and ‘resourcing’. After optimising the Dutch translation, comments on the understandability of terms were not repeated in round two. A couple of GPs noted that it was not very clear how to interpret the scales used in the infographics, as these were not used in daily practice.

Erm ‘mean score’, yes. I suspect it’s also in 1000 patients… Erm… Or is it in percentages? (round two, 48-year old woman, urban duo practice)

Usefulness

Many of the physicians perceived the topics of the infographics as innovative and rewarding in meeting their information needs. They acknowledged their potential for use in primary care, as the infographics supported them in the subsequent shared decision-making process with their patients. Development of new infographics was supported and seen as bringing added value to the guideline content. Although the infographics were not designed primarily to support direct shared decision-making (there were specific decision aids for that), GPs perceived that they may help for some, yet not all, clinical topics. Several GPs claimed the patient profile (eg, level of education) and context to be of particular importance when discussing options with patients.

To seek confirmation for your own choice and then at the same time perhaps be able to persuade your patient with a few very specific criteria that might uhm… help to convince the patient a little more to do or not do something. (round two, 34-year old man, urban solo practice)

Some physicians found the infographics not to be adapted to local guidelines (eg, the recommended antibiotic regimen was atypical in their context), which clouded their judgement in choosing a particular treatment. Physicians noted they would be less eager to use recommendations that were not in line with their current practice. One physician did not even consider reading the infographic for this reason.

But for the time being, it’s not changing practice, no. (round two, 48-year old woman, urban duo practice)

The GPs felt they were supported in their knowledge of the topic, as the infographics displayed the most relevant information. Only some suggested adding more information to certain infographics. This contrasts with the many statements that the extra information in the summary of findings tables was overwhelming for the majority of the physicians; it was therefore often categorised as less important and unclear. The physicians noted that this information can be useful when one decides to dive further into it.

With some patients we don’t even make an incision, yet do start the antibiotics, so that’s also a possible option which wasn’t actually there. (about the infographic ‘antibiotics for skin abscesses’, round two, 62-year old woman, rural duo practice)

Below is the underlying evidence … Yes, I think that’s especially… It’s useful indeed if you want to delve deeper. (round two, 32-year old man, urban group practice)

Some clinicians perceived the section ‘population’ as clearly defined, but others thought it was too heterogeneous in some of the infographics. This may reflect their need for more recommendations or evidence summaries stratified by type of patient (assuming the body of evidence allows it). For example, one clinician commented that the patient characteristics were not fully considered.

The age of the patient, the profession of the patient. Those are all things that matter. The fact that it does not mention them… It doesn’t take them into account. (round one, 70-year old man, rural duo practice)

Desirability

In general, clinicians responded positively to the overall layout. Opinions regarding the choice of colour, however, were mixed. Most seemed fine with the use of colours, even describing them as clear and appealing. Others found the variety too great, with some even distracted by the number and type of colours displayed. They preferred a more straightforward and contrasting colour scheme, as they struggled to infer meaning from the chosen colours.

First and foremost I thought the part for the doctors was very well explained, very pleasant. Nicely drawn with all those colors, very attractive. (round two, 62-year old woman, rural duo practice)

I have to say I don’t understand the color code immediately. (round one, 42-year old woman, urban duo practice)

Some GPs expressed minor frustrations with the lack of uniformity in layout between the different infographics, while one GP was glad that the bar displaying the recommendation had the same layout throughout the different infographics.

Here there’s recommendations, here we have comparison and here there’s recommendations with quotation marks. (Referencing the headings for the recommendations of knee arthroscopy, PSA screening and thyroid hormones, respectively) (round two, 45-year old man, rural group practice)

The physicians thought the sequence of the different components, namely ‘population’, ‘interventions compared’ and ‘recommendation’, was very logical.

Findability

Both a printed version and a link to the online static version of the infographics were provided to the GPs. It was therefore difficult to evaluate their experience of actually searching for and finding a given infographic.

When asked whether they could find these infographics again in the future without us providing them, some GPs stated they would not be able to. As it is hard for GPs to keep up to date on all that is new, some indicated they preferred the guidelines to be brought to them rather than having to search for them single-handedly. Some of the older GPs preferred to have a printed version in their drawer to be able to find it more easily. Suggestions were made repeatedly to integrate the infographics into the EHR through the evidence linker for instant access. In the second round, where access through the EHR was provided, all but one GP considered the evidence linker an added value, and even a necessity, for reaching the tool.

And also through the evidence linker in the EHR? Well, that would be a big added value. If it is approved and supported, I think it would otherwise vanish into nothing. Well, as a young GP, I get in touch with this through you, but otherwise I’m not going to search this on Google, you know. We also don’t have the time as GPs to seek out every guideline and check if it’s correct. That should actually be done by scientists who want to sacrifice their time for this, you know. (round one, 45-year old man, rural group practice)

In addition, some physicians proposed being notified when new infographics became available, for example, through the EHR. One GP proposed using ‘recent updates’ on the website of BMJ Rapid Recommendations. Another GP proposed the use of a mobile application to keep up to date.

So that would be nice, possibly through the EHR, that there would be a possibility to be informed about ‘this is an available rapid recommendation that is usable for primary care’. (round one, 60-year old woman, urban group practice)

Some GPs experienced difficulties in scrolling through the long list of existing infographics, describing the process as time-consuming. One GP proposed arranging the infographics alphabetically and by discipline.

but with some search terms I do get a discouraging amount of lines where you really have to search as to what actually applies here. (round one, 60-year old woman, urban group practice)

Accessibility

The main comments focused on the printed design used in the first round. Although the infographics were designed to be used as interactive, expandable tools, the printed version was a practical necessity in the first round. As a result, the large amount of information summarised on one A4 page led to a small font size, limiting the readability of the content. This gave one GP the impression that the content was less important and even negligible. However, clinicians had overall positive feelings towards the refined visual design in the second round, where mostly the digital infographics were used. Other concerns about the layout were the atypical and inconsistent colour combinations and the lack of contrast.

Without glasses, I can’t read it. (round one, 46-year old woman, rural group practice)

When it is color on color, it is more difficult to read it and you drop out more quickly. (round one, 64-year old man, urban solo practice)

The preference for paper or digital medium was mainly based on habit except for one clinician who was less digitally skilled. There were no concerns about the availability of the digital platform.

How come that this is more time consuming for me? Because I’m less skilled with the computer. (round one, 46-year old woman, rural group practice)

Credibility

The vast majority of clinicians perceived the infographics as trustworthy and they were unanimous in their confidence in The BMJ. This led most GPs to focus on the ‘main message’, as they were overall less concerned with the underlying evidence. The inclusion of the infographics in the evidence linker of the EHR in round two further benefited their trustworthiness.

but since it’s made by BMJ, which for me is a very trustworthy source. (round one, 60-year old woman, urban group practice)

The GPs expressed more trust in the data and recommendations if these aligned with their own standard of care. However, this could backfire when they disagreed with the recommendation. Confusion about certain data or scales shown in the infographic could also lead to diminished trust in the infographic as a whole.

It confirms somehow what I do in practice, so that’s why I can have confidence in it. (round one, 38-year old woman, rural group practice)

I myself can’t agree that people with meniscal tears are treated conservatively. (round one, 70-year old man, rural duo practice)

Several clinicians struggled to interpret the different degrees of evidence supporting the recommendations. While some even admitted to having glossed over this aspect completely, those who did pay attention had a clear preference for strong recommendations. Weak recommendations were often perceived as a validation of lingering doubt regarding the subject (eg, PSA screening). Several doctors described discomfort with the inclusion of weak evidence. Other doctors, however, saw weak recommendations as beneficial, as they gave them more flexibility in their interpretation.

If the conclusion is less clear or, let me put it this way, less pronounced, well yeah then it raises doubts a bit. (round one, 38-year old woman, rural group practice)

Discussion

Main findings

We tested the user experience of Belgian GPs using five translated RapidRecs infographics, in two consecutive iterations. To our knowledge, this is the first study to perform iterative and comprehensive user testing with a hybrid think-aloud method for the evaluation of infographics as evidence summaries among GPs. A summary of the results can be found in table 3.

The GPs had an overall positive experience using the infographics. The infographics provided the right information quickly and were easy, pleasant and intuitive to use. The digital interactive versions were preferred, as they provided expandable information when necessary while also offering a clear core message at a glance. Complex colour schemes were found to be confusing when meaning was sought in them. Even with the graphical representation, GPs still had trouble understanding terminology related to evidence appraisal and unfamiliar scales, as well as applying statistics to the individual patient. Access through the EHR was found to be very supportive. The infographics were found to be very trustworthy, and GPs recognised their potential in daily practice. Discordance with local guidelines or with GPs’ own views seemed to be an important barrier to implementation of the recommendations illustrated by the infographics.

Comparison with other literature

Our study confirms previous findings that physicians perceive infographics as enjoyable, easy to read and user-friendly.27–30 As suggested by previous authors, they did seem to offer rapid information retrieval and hence are potentially promising tools for making it easier to keep up to date with guidelines.17 23

Many of the comments we observed can be related to the impact of the infographics on GPs’ time investment. They wanted to be able to see the core message at a glance, wanted rapid access through the EHR and desired a clear colour scheme and uniformity between infographics to be able to move quickly towards the needed information. GPs often face time constraints and need to gather answers to clinical questions rapidly, as these often arise at the point of care.48 49 For that matter, GPs prefer short guideline recommendations that are easy to understand.7 50 51 In our study, GPs tended to focus mainly on the core message and follow the recommendation depicted by it. This confirms the previous concern that infographics risk conveying information in an oversimplified manner, losing sight of important nuances of the underlying studies.25 This is particularly important for ‘weaker’ recommendations, where more information, such as health-related and risk-related statistics, is needed to be able to make shared decisions with patients.52 Providing an expansion with a deeper explanation prompted some GPs to delve into this when necessary, though some found the recommendation alone to be sufficient. This is of great importance for future infographic development for GPs: effort should be put into conveying a clear and correct core message, while also encouraging GPs to delve into the specifics, especially when recommendations are not strong and unambiguous.

In our study, GPs found the infographics rewarding in meeting their information needs. They felt they had learnt new things and were able to provide more information to their patients. This perception of increased knowledge was also found in another study of infographic use.29 It contrasts, however, with yet another study in which only a poor increase in knowledge was actually measured when infographics were used.30 Infographics have also failed to outperform other, simpler formats in that regard.28 30 An explanation for the discrepancy between the perception of increased knowledge and actual knowledge retention might be found in the time course of information needs. Primary care physicians encounter an enormous variety of clinical cases every day. Infographics might be more supportive in their decision-making by providing just-in-time, tailored and evidence-based information, rather than being used as tools to increase their general knowledge in the long term. Integrating access to the infographics into the EHR, linked to a coded diagnosis, was hence found very useful by physicians. Previous studies have also introduced and evaluated so-called ‘infobuttons’ and found that physicians use these EHR-integrated infobuttons for short, tailored searches.53 54 Other formats, such as scientific abstracts or plain language summaries, might be as effective as infographics for long-term knowledge retention and are less costly and time consuming to produce.

Lack of agreement, lack of adaptation to local guidelines and lack of strong recommendations were mentioned as barriers to the use and implementation of the infographics and their recommendations. These barriers concur with those seen for guidelines in general and are hence not specific to the infographic format.14 50 Lack of agreement can be provoked by a lack of adaptation to local guidelines, as geographical variations in healthcare delivery have been widely documented.55 Lack of clarity or ambiguity is also known to decrease adherence.50 This might explain why weak recommendations were adopted less, as they provided less confidence and even caused confusion. Ambiguity is, however, intentional in weak recommendations, as no single answer is the right one and patient values and preferences should be taken into account. It is possible that the GPs in this study were used to following strong guidelines and still have to become accustomed to weak recommendations.

It is striking that, even with the graphical representation, GPs have difficulties understanding terminology related to GRADE (such as strength of recommendations, evidence certainty or quality), unfamiliar scales and certain statistics. This means that the way these formats are displayed, or even the choice of words conveyed by a translation, plays a role in understanding. The GRADE working group is aware of these challenges, and a whole body of evidence attempts to find better didactic ways to convey these concepts and their implications for practice.56–59 Statistical (and even scientific) illiteracy will probably not be solved by infographics alone. It is caused by a plurality of issues, such as the still existing paternalistic nature of the doctor–patient relationship, where trust in authority makes statistical literacy ‘unnecessary’, as well as the influence of determinism, where physicians seek causes instead of probabilities, and the illusion of certainty, where patients seek certainty even when there is none.19 Even though maximal effort should be put into making these terms and numbers as understandable as possible, infographics might not be the ideal tool to also educate GPs on these issues, especially since GPs preferred the infographics to be as concise as possible. They could, however, provide support by linking to training courses or further explanations and by encouraging physicians to explore these.

Strengths and limitations

The strengths of our study include the thorough process of user testing analysis. By repeatedly analysing the interviews in different cycles, with at least two researchers for each interview, followed by group discussion and reflection, we made sure no findings could be lost in the process. We also performed a sufficient number of user tests, with variation in age, gender and type of GP (see table 2), to capture a broad range of opinions and experiences. None of the members who performed the interviews and the analysis were part of the organisation that designed the infographics, which added to objectivity. By combining think-aloud interview techniques with a previously set interview guide, we balanced capturing immediate reactions and flow of thought with affording participants the opportunity to reflect more thoroughly, thereby facilitating a more nuanced understanding of their experiences. The infographics were also used and evaluated in a natural environment, making the findings more applicable to a real-life setting.

Previous investigations indicate that the design of knowledge transfer tools should be based on the specific preferences and needs of the users.27 28 60 61 User testing in a specific setting such as primary care is hence very informative for the further development of the infographics. In the past, most guidelines have taken a content-based approach to testing, checking whether appropriate information is being given. User testing, however, analyses quantitative and qualitative findings on the experiences of healthcare professionals and permits subsequent modification over many consecutive cycles.62 It has been proven to increase information retrieval and comprehension by healthcare professionals and has resulted in safer care as well.63 64

Our study is limited in what we can learn from it due to the methods used. We explored the experiences of GPs after they had used the infographics. We did not investigate whether these tools actually succeeded in improving knowledge. We did not collect quantitative measures, such as actual use or impact on physicians’ behaviour or on health outcomes. We also did not compare the infographics with different formats, so we cannot draw conclusions on how experiences with them relate to experiences with other tools.

Our study was limited to Dutch-speaking, Belgian GPs. A similar study might yield different results in other countries with different cultures or healthcare systems. Furthermore, our GPs were all trainers of GP trainees. They might be more open to new and innovative approaches than the average GP.

Another limitation is that we handed out static, non-interactive infographics to the GPs. Findability, as one of the facets of Morville’s honeycomb, was therefore difficult to evaluate. Another study design should be set up to investigate this further.

The second round of this study was performed during the COVID-19 pandemic. To our knowledge, the COVID-19 measures had no substantive effect on our results, with a good variety of topics discussed and consultations with patients. The interviews could still be done face-to-face, although facemasks were worn.

Finally, our study design included a training session before the infographics were used, which is not how practitioners will access them online in real life. This was necessary to use the more complex printed version in the first round, but may have affected the user experience in an unpredictable way. Nevertheless, the main findings of our study likely apply regardless of whether such training has occurred, and they are consistent with previous findings.

Implications for further research and development

Based on our findings, we are able to provide some rules that can guide future developers of guideline infographics targeted at GPs:

  1. Have time in mind. GPs have only limited time and use the infographics mainly as rapid, just-in-time information for decision-making. Provide a clear core message at a glance, rapid access through an EHR and an intuitive design where further information can be accessed on demand.

  2. Carefully describe the core message. GPs tend to focus mainly on the core message and often neglect more detailed information. Make sure the core message conveys what is meant and avoid the risk of divergent interpretations or apparent certainty in recommendations where there is none. Encourage GPs to delve into the details for a more nuanced overview of the evidence.

  3. Consider statistical and scientific illiteracy. Like many other physicians, GPs often have insufficient skills in statistics and evidence appraisal. Even though important, the presentation format alone might not be sufficient to support a correct understanding of the recommendations. An additional course or other form of education should ideally be encouraged.

  4. Adapt to local guidelines. Many GPs follow local guidelines. The credibility and usefulness of the infographic might decrease significantly if these are not considered. Incorporate them or justify why you deviate from them.

Further user experience evaluation of infographics developed for physicians is needed. Many infographics are being developed, yet follow-up on their impact is lacking. In this study, we investigated only one format of infographics. Comparison with different formats might be informative, as well as investigation of knowledge retention, guideline adherence and health outcomes. This study should be seen as part of a continuous process, with each iteration necessitating further user testing.62 We recommend further user testing in a broader range of GPs and specialists, as well as in researchers and policymakers who might benefit from infographics as well. We are aware of another group working with the BMJ Rapid Recommendations conducting similar user testing among hospital-based doctors.

Conclusions

Infographics can be useful tools in daily primary care, as they can offer an enjoyable and visually appealing format for rapid retrieval of information on guidelines and recommendations. The infographics were found to be most useful for rapid, tailored and just-in-time information retrieval, which was supported by a clear core message, an intuitive design and integration into the EHR. Terminology regarding evidence appraisal and statistics remained difficult even with the infographics. Lack of consideration of local guidelines led to frustration. Further user testing in different contexts, comparison with different formats and assessment of impact on quantitative measures such as knowledge retention and health outcomes are needed.

Data availability statement

Data are available upon reasonable request. The data sets used and/or analysed during the current study are available from the corresponding author on reasonable request.

Ethics statements

Patient consent for publication

Ethics approval

The study protocol was approved by the Research Ethics Committee UZ/KU Leuven (Belgium) on 31 October 2019 with reference number MP011977. Written informed consent was obtained before the general practitioners’ and patients’ participation in the study. All methods were carried out in accordance with relevant guidelines and regulations. Participants gave informed consent to participate in the study before taking part.

Acknowledgments

We would like to thank the general practitioner-trainers for their contribution. We thank Lisa van der Auwera for helping us perform the forward–backward translation of the infographics. We thank the people at CEBAM and ebpracticenet for making the infographics available through the evidence linker. We thank MAGIC and the BMJ for providing the investigated infographics.

References

Supplementary materials

Footnotes

  • Twitter @ThomasAgoritsas

  • Contributors BA, MVe, GEB, ND and TA were involved in the design of the study. PVB, NS, MVa, AV, NVDA, AH, CD, EO, WM and WS-T collected the data. PVB, CD, EO, WS-T and WM analysed the data and wrote the initial manuscript. PVB, BA, MVe, GEB, ND, WS-T and TA revised the manuscript. PVB wrote the final manuscript. MVe was responsible for the overall content as guarantor. All authors read and approved the final manuscript. The corresponding author attests that all listed authors meet authorship criteria and that no others meeting the criteria have been omitted. The lead author affirms that the manuscript is an honest, accurate and transparent account of the study being reported. No important aspects of the study have been omitted. Any discrepancies from the study as planned and registered have been explained.

  • Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.

  • Competing interests TA is co-leading the BMJ Rapid Recommendations, and TA and WS-T have co-designed all infographics. The other authors declare no other competing interests.

  • Patient and public involvement Patients and/or the public were not involved in the design, or conduct, or reporting, or dissemination plans of this research.

  • Provenance and peer review Not commissioned; externally peer reviewed.

  • Supplemental material This content has been supplied by the author(s). It has not been vetted by BMJ Publishing Group Limited (BMJ) and may not have been peer-reviewed. Any opinions or recommendations discussed are solely those of the author(s) and are not endorsed by BMJ. BMJ disclaims all liability and responsibility arising from any reliance placed on the content. Where the content includes any translated material, BMJ does not warrant the accuracy and reliability of the translations (including but not limited to local regulations, clinical guidelines, terminology, drug names and drug dosages), and is not responsible for any error and/or omissions arising from translation and adaptation or otherwise.