
So much research evidence, so little dissemination and uptake: mixing the useful with the pleasing
Charlotte Waddell, MSc, MD, CCFP, FRCP(C)
University of British Columbia, Vancouver, British Columbia, Canada


To the continuing consternation of many health scientists, their best research results, the fruits of much caring toil and labour, often appear to remain unused by health clinicians and policy decision makers. Despite the volumes of research evidence available, relatively little is disseminated and taken up or applied in practice.13 These dissemination and uptake problems are neither new nor unique. The literature from many disciplines is replete with examples of new research findings not being widely used in decision making, sometimes for decades or more.4, 5 The problem, however, has been noticed more acutely in the health disciplines over the past decade, with the widespread adoption of “evidence-based” approaches and the ensuing concern that health practices and policies should be based on the best research evidence available.1, 2

Research dissemination and uptake problems have been particularly well described for physicians. Numerous studies have documented that physicians have difficulty applying new research findings in clinical practice, even when these findings are packaged in ready to use formats such as clinical practice guidelines.1, 2 The same problem likely exists for nurses, social workers, and other health practitioners. The problem of research dissemination and uptake becomes considerably more complex when other kinds of health research audiences, such as administrative and legislative policy decision makers, are considered. These groups are even more diverse than clinical practitioners in their research information needs and in the barriers to and incentives for research dissemination and uptake.1, 6 They have certainly been less well studied.1, 6

What factors contribute to the problem with research dissemination and uptake—with clinical as well as administrative and legislative decision makers? Why does the problem persist in the health fields (including mental health)? Are any new directions or strategies emerging to help to alleviate the problem? These questions will be considered from a broad health policy perspective.

With over 2 million articles published annually in over 20 000 health related journals,7 the problem cannot be due to insufficient quantities of research evidence. In fact, the burgeoning quantities of research evidence are part of the problem, overwhelming clinical practitioners and other decision makers with too much evidence. These large quantities of research evidence are also scattered across multiple sources and in different media, making convenience of access an issue.

Although the quantity of research evidence may be overwhelming, the quality may not be. It is unclear how much of the research evidence published annually is of sufficiently high quality (with respect to scientific rigour and practical relevance) to merit decision makers changing their practices or policies in response. To complicate matters, particularly in mental health, scientific evidence is still lacking, or is still controversial, for application to many important problems.

Several efforts have been made to address the issue of overwhelming quantities (as well as the issue of uncertain quality) of health research evidence by summarising research findings in succinct “evidence-based” formats, particularly for clinical decision makers. “Evidence-based” journals (such as this one) screen the research evidence, sorting the higher quality wheat from the voluminous chaff to present readers with a digestible distillation of the latest and best research evidence.8 Systematic reviews are promoted as another way to assemble critically appraised scientific evidence on a given topic.7 Systematic reviews are available through centralised sources such as the Cochrane Collaboration,9 and are increasingly recognised in some (but not all) mental health and other health related journals. Clinical practice guidelines, finally, are widely touted as a tool to summarise the best available research evidence together with recommendations for practitioners. Myriad practice guidelines now inundate the clinical landscape, published in various journals and in specialist or discipline specific communications, some more “evidence-based” than others.2, 10

How well do these “evidence-based” summarising approaches actually work to influence decision making or behaviour? Clinical practice guidelines have been studied the most extensively, particularly with clinical decision makers. Most studies have found that clinical practice guidelines have only moderate effects on behaviour; on their own, guidelines do not reliably change practice.2, 10, 11 Little attention has yet been focused on the effect of guidelines on other kinds of users, such as policy decision makers. The other summarising formats (“evidence-based” journals and systematic reviews) have not yet been well evaluated. Even if these are all excellent methods to summarise critically evaluated research evidence, the problem remains: simply providing good quality information (even if it is “evidence-based”) is not enough to change behaviour. Summarising good quality research evidence is a necessary first step, but not necessarily a sufficient one.

As a second step, more active dissemination approaches are needed to ensure that good quality research evidence is actually used. Several more active dissemination approaches show promise in studies with clinical decision makers: audit and feedback, use of opinion leaders, and academic “detailing.”1 Even these active approaches, however, do not always result in behaviour change.1, 2, 4, 10, 11 Much less is known about effective dissemination approaches with administrative and legislative decision makers. Furthermore, little is known overall about which dissemination approaches work best with which decision makers in which kinds of settings.1 Clearly, a gap remains with respect to getting good quality research evidence not only published and summarised but also put into practice.

With relatively little to go on empirically (at least so far) to explain and address the problem with research dissemination and uptake more effectively, another starting point may be to examine some of the differences in the underlying systems or “cultural” contexts in which researchers, clinicians, and policy decision makers operate. Each group works and thinks in a different social setting where different ideologies (values, beliefs), institutional structures, and interests and incentives apply.1 These social systems or contexts influence not just the way people think and work in general, but also the kinds of research (and other) evidence and the kinds of communication formats that are preferred, needed, or used. The table (modified from Lomas1) suggests some of these contextual or systemic differences for researchers, clinical practitioners, and administrative and legislative decision makers.1, 6

As the table suggests, researchers and various kinds of decision makers each operate in different social and organisational settings that rely on different kinds of evidence, preferably received through different kinds of communications formats. Part of the problem with research dissemination and uptake may be that there is relatively little intersection or overlap between the context specific needs and the types of evidence and communications formats that are preferred or used by each of the different groups. Researchers, clinicians, and policy decision makers have also often had only a limited understanding of the constraints and context issues affecting the others,1 adding to the problem.

Researchers, governed by principles of academic excellence, have often acted as if reality were highly rational and as if (research) evidence flowed “top-down” in one direction, such that simply supplying good quality research evidence should be sufficient to motivate people to change their behaviour. Researchers have been answerable mostly to their peers in discipline specific communities. Many researchers have known relatively little about decision makers' settings, which often involve the need to respond rapidly to complex crises or events.1 Many researchers could also be better equipped to communicate more effectively and more frequently in clear (plain language) terms, particularly in the media and through personal contacts with decision makers.1, 6, 12

Decision makers, on the other hand, often operate in settings where clinical and programme or policy problems need to be solved quickly and cost effectively.1, 6 They often deal with multiple problems from multiple perspectives. They may not always appreciate the need for the careful, curiosity driven research work on basic science questions that often serves as a foundation for later applications to practical problems. They may often be frustrated by the slowness and seeming irrelevance of many researchers' approaches. Decision makers have also frequently had a limited understanding of the work settings for many researchers,1 particularly those based in universities where funding and publication processes can be slow, and where dissemination and policy oriented work is often poorly rewarded.

Researchers and the various decision makers appear to have functioned in their own solitudes according to their own cultural rules and mores.1 Researchers want their findings to be used, and decision makers need research findings to solve clinical, programme, and larger policy problems. Decision makers, especially those working in governments concerned with fiscal restraint, also increasingly want to ensure effective outcomes with their policies and programmes, which usually requires research evidence input.6 Decision makers also have extremely useful ideas to contribute to formulating and answering research questions. To use a physiological analogy, however, there has been little or no connective tissue between these various solitudes. It is little wonder that research dissemination and uptake have been relatively intractable problems in health related arenas.

Fortunately, several potential strategies are emerging that may help to alleviate these problems and connect the solitudes. Overall, there appears to be a clear need to develop new structures and processes—connective tissue—to explicitly facilitate the flow of relevant information and influence between all the concerned parties or stakeholders. As well, other stakeholders such as funders, universities, independent research and policy advocates, and patients and families need to be brought into the mix more effectively.

In terms of new structures and processes, researchers (particularly those working in applied areas) could equip themselves to communicate more effectively and more regularly with decision makers. To do this, many researchers could benefit from better training and support to work with media, community, and policy groups.12 Researchers could advocate for communications and policy work to be better recognised and rewarded (or at least not penalised) by universities and other host institutions. Researchers could also (as many already do) involve themselves in more multidisciplinary activities that are problem focused, as opposed to working in “silos” of specialisation.1

Most importantly, researchers could explore more collaborative models that explicitly involve practitioners and policy makers—who are working in contexts where the research needs to be used—as meaningful partners in all stages of research. Research partnerships also need to be built with patients, families, and community leaders.

Administrative and legislative decision makers and their agencies or host institutions could organise more effective structures and processes to systematically seek and incorporate relevant research information and maintain links with key researchers.1 This is already being done, for instance in Canada, by provincial and federal government agencies that have established ongoing “expert” advisory councils, or have funded ongoing partnerships with university research units to focus on applied questions of interest to government.6 Decision makers have a role to play in ensuring that the people and the organisational structures are in place to actively disseminate research evidence, and in working with researchers to ensure relevance.

Decision makers, particularly at the administrative levels, would also benefit from more training in critically appraising research evidence. Similarly, basic training for all clinical practitioners needs to equip people to incorporate and apply research evidence more easily. Continuing professional development activities need to emphasise these skills as well, through professional, college, maintenance of certification, and continuing education activities.

Research funders are another group with a part to play. Funders could provide substantially more incentives for research dissemination and uptake activities, beyond the usual peer reviewed requirements. More collaborative (as opposed to “top-down”) research models could be actively encouraged that, again, include meaningful partnerships with decision makers in all aspects of the research process, particularly for more applied kinds of research. This approach is currently being encouraged by some funders. For instance, in Canada, agencies such as the Canadian Health Services Research Foundation13 require research applicants to involve decision makers in meaningful ongoing partnerships, and expect decision makers to facilitate dissemination and uptake activities within their own organisations. Funders could also greatly assist researchers with incentives to work in multidisciplinary problem oriented teams. Although these suggestions should never supplant the need for ongoing (and probably separate) funding for basic science or curiosity driven research, with incentives and assistance from funders, a great deal more research evidence could be applied more quickly. Finally, much more funding is needed for research that specifically focuses on dissemination and uptake issues. Funders could assist here, too.

Universities and their governing bodies also have influence. These groups could do more to reform academic incentive systems to reward a wider range of communications and policy activities, again, particularly for applied researchers. Both curiosity based and applied research will always be important, but most university incentive systems currently reflect traditional academic models most suited for curiosity based research. More balance and more variety are needed if decision makers' needs are to be better taken into account, and if more research evidence is to be better disseminated and used. Universities could also greatly assist researchers with training and support for communications and policy activities, using economies of scale not available to individual researchers.

Another strategy has emerged with help from both the private and the public sectors. Independent research and policy advocacy groups can (and do) provide independent funding for certain kinds of research, and maintain active relationships with government to promote awareness and use of research findings.1, 6 Particularly effective (Canadian) examples of this include the Canadian Institute for Advanced Research and the National Centres for Excellence, where researchers have been funded and linked in networks to work on common problems, and where advocacy with decision makers has been supported without sacrificing academic excellence.14

Yet another strategy involving all the parties discussed so far has been proposed but not yet widely implemented.1, 15 It is a strategy that requires new functions and structures to be created to specifically train and employ people—“knowledge brokers”—who are both research literate and knowledgeable about the settings and needs of decision makers. These “knowledge brokers” would also need to be highly skilled at communicating with all the parties involved as well as with the media.1, 12, 15 Currently, it is nobody's job to disseminate research evidence, or to ensure that decision makers' needs are brought into the research process in a systematic way.

In addition to connective tissue, strategies are needed to coordinate information systems overall. Currently, no coherent or national approach exists (at least in Canada) for exchanging ideas and information between disparate groups in an organised or meaningful way. We especially lack mechanisms for busy practitioners and decision makers of all kinds, as well as researchers, to access and exchange information quickly and easily. Newer media such as the internet have added a plethora of choices but not necessarily the critical perspectives or efficiencies that are needed. We need more initiatives coordinated at a national level to simplify and streamline information exchanges between all the concerned groups, including patients, families, and community leaders.

To work well, these strategies will require shifts in thinking on everyone's part: to be more aware of others' settings, needs, and constraints; to think in more relational ways; and to recognise that everyone is part of the same overall system, often working on the same long term goals (such as the betterment of health or mental health), albeit in different ways. These strategies require a commitment to the importance of both scientific rigour and policy relevance. Solving difficult health problems certainly requires both. These strategies also require shifts in the use of resources to explicitly acknowledge that communications structures and processes need to be established in a more organised and cross contextual way than they are at present.

Finally, in case these proposals sound utopian, it must be acknowledged that changes of this magnitude will take time. And ultimately, human institutions and behaviours will always be shaped by processes that are (at least in part) non-rational. In the spirit of both the rational and the non-rational, we would do well to reflect on the still germane comments of Cervantes' Don Quixote:

There have been many who, not knowing how to mingle the useful with the pleasing in the right proportions, have had all their toil and pains for nothing.
Miguel de Cervantes. Don Quixote, 1620.

References