Building a Bridge
14%: The average share of new medical discoveries that make it into usual practice over a 17-year period.
Published in the Winter 2012 issue of SSA Magazine
New approaches help practitioners use evidence-based practice
Practitioners have access to an ever-growing arsenal of effective behavioral health interventions for concerns ranging from trauma to parenting, and technological advances have facilitated the identification, communication and synthesis of large amounts of research information. These exciting advances have emerged in the context of increasingly limited resources and a demand for accountability in social and health services.
Despite these pressures, opportunities and advances, the regular movement of findings from research into practice has proven to be challenging. In fact, the gap between research and practice is so wide that it has been characterized by the Institute of Medicine as a chasm. Research in healthcare suggests that only a fraction of clinical practices are based on evidence, and one study over 17 years found that an average of only 14 percent of scientific discoveries made it into usual practice. I would expect that number to be even lower for behavioral health interventions.
One approach to crossing the research chasm is evidence-based practice (EBP). Social work has talked about moving knowledge from the academy to the clinic for a long time, and the term evidence-based practice is often used loosely as shorthand for this concept. EBP, however, is a formal process, originally outlined in medicine in the late 1990s and later adopted in social work, that utilizes a specific framework to accomplish these goals.
Even though EBP is a relatively new approach to service delivery, its meaning has evolved and transformed over time. At the outset, the EBP process involved making practice decisions through a series of steps based on the intersection between best available research evidence, client needs and preferences, and practitioner expertise.
These steps emphasized the identification and evaluation of research evidence, but provided little guidance to practitioners on implementation. For example, one critique of the early model of EBP is its lack of recognition of the realities of the practice environment and the complexity of the implementation process. Many evidence-supported interventions involve expensive trainings and materials, require highly skilled clinicians, take time to implement, and are difficult to assess in terms of the full cost to a service provider. Lack of funding and lack of time are two of the most frequently cited barriers to the implementation of EBP.
Another challenge has been the emergence of the list approach to EBP, where states, funders or other authorities select and mandate a limited collection of “evidence-supported interventions.” Although this may be an intuitively simple approach to EBP—use interventions that have been shown to work—the approach has a number of pitfalls and has likely encouraged some negative attitudes toward EBP. Even the best interventions don’t work for everyone, and therefore a variety of treatment options are needed. Furthermore, many intervention models have been tested with limited client populations and practice contexts, and few include strong guidance regarding acceptable adaptations to fit diverse clients, communities and service providers. The list approach can also stifle innovation in the field by limiting practitioners who wish to develop and test novel interventions.
Fortunately, alternatives to the list approach to EBP and improvements to the original EBP process model are continuously being developed, along with training supports and other resources. I’m excited in particular about two new developments in the field.
The Transdisciplinary Evidence Based Behavioral Practice (EBBP) model integrates advances made in EBP across social work, nursing, medicine, public health and psychology, taking advantage of each allied health profession's strengths and advances. Led by Bonnie Spring at Northwestern University, the EBBP Project uses a team science approach, involving leading thinkers in EBP from each of these disciplines. The project identifies training gaps and creates learning resources to facilitate research-to-practice translation across disciplines, and it has developed a free, high-quality, online training environment on this transdisciplinary approach (ebbp.org).
The Common Elements approach to EBP presents an alternative to implementing entire evidence-supported intervention packages. In this approach, evidence-supported interventions are broken down into their key clinical ingredients so that service providers can make mindful adaptations based on the service context and clients' needs. A team led by Bruce Chorpita at UCLA has developed training and an online support toolbox for practitioners, PracticeWise (practicewise.com), to support the common elements approach.
I include information about both of these emerging approaches when I teach, and I encourage others to take a look as well, especially practitioners in the field. Successfully crossing the research-practice chasm will require a multi-component, innovative and sustained effort. Building that bridge will likely take time, but done properly, it will give social service clients access to the best interventions and research we have available. The effort is well worth it.
Jennifer Bellamy is an assistant professor at SSA.