Printed from Encyclopedia of Social Work. Under the terms of the licence agreement, an individual user may print out a single article for personal use (for details see Privacy Policy and Legal Notice).

date: 17 April 2024

Evidence-Based Practice

  • Jeffrey M. Jenson, University of Denver
  • Matthew O. Howard, University of North Carolina at Chapel Hill


Evidence-based practice (EBP) is an educational and practice paradigm that includes a series of predetermined steps aimed at helping practitioners and agency administrators identify, select, and implement efficacious interventions for clients. This entry identifies definitions of EBP and traces the evolution of EBP from its origins in the medical profession to its current application in social work. Essential steps in the process of EBP and challenges associated with applying EBP to social work practice, education, and research are noted.


  • Research and Evidence-Based Practice

Evidence-based practice (EBP) is a five-step process used to select, deliver, and evaluate individual and social interventions aimed at preventing or ameliorating client problems and social conditions. At its most basic level, EBP seeks to systematically integrate evidence about the efficacy of interventions in clinical decision-making. Adhering to EBP, however, is a complex process that requires practitioners to be skilled at posing practice-relevant questions and proficient at accessing evidence that answers these questions. Importantly, practitioners must have the requisite methodological skills to evaluate evidence about the efficacy of interventions from clinical trials, systematic reviews, and meta-analyses. Finally, to teach the process of EBP, social work educators must be competent in tasks associated with information retrieval and interpretation of evidence.

A recent surge of interest in EBP is raising awareness of the importance of considering empirical evidence when selecting interventions, particularly among practitioners who may not have considered such evidence in the past. At the same time, the sudden growth of EBP gives rise to a cautionary note about the many different ways that EBP is being defined in published works and taught in the classroom. A consistent definition of EBP and an educational commitment to the process steps required in EBP are critical at this juncture to prevent the misuse or misunderstanding of this new paradigm.

Definitions and Evolution of EBP

EBP appeared in the medical profession in the 1990s as a process to help physicians select effective treatments for their patients. The introduction of EBP in medicine was viewed by many scholars and practitioners as an effective way to bring research findings to medical practice decisions. The rapid diffusion of EBP since then has been attributed to advances in knowledge about the prevention and treatment of medical conditions and to economic forces that emphasize the selection of efficacious treatments as a strategy to reduce health care costs (Gray, 2001).

The growth of EBP in medicine has also been a product of an increasingly active and well-informed patient population. Unlike prior generations, a significant portion of today's patients are well educated about their medical problems and expect to receive optimal treatments for their conditions. The sophistication of medical consumers has required physicians to become more skilled at evaluating and applying evidence to medical practice decisions (Gambrill, 2006; Gray, 2001; Wennberg, 2002).

Definitions and perceptions of what EBP is—and what it is not—vary widely. In what is arguably the most widely accepted definition of EBP, Sackett and colleagues (Sackett, Straus, Richardson, Rosenberg, & Haynes, 2000) state that EBP is “the integration of best research evidence with clinical expertise and [client] values” (p. 1). This definition implies that EBP is a process characterized by specific steps and actions. In an earlier publication, Sackett, Richardson, Rosenberg, and Haynes (1997) had defined EBP as “the conscientious, explicit, and judicious use of current best evidence in making decisions about the care of individual [clients]” (p. 2).

The introduction of EBP in medicine has created considerable interest in the process of applying evidence to medical practice decision-making. Importantly, scholars also believe that EBP has moved the medical profession away from its long-standing reliance on authority-based decision-making processes that fail to adequately consider empirical evidence (Gambrill, 1999, 2005).

Essential Steps of EBP

As the above definitions imply, EBP is both a philosophy of practice and a process that implies a series of structured steps. Sackett et al. (2000) have been credited with developing the five essential steps of EBP.

Step 1: Converting Practice Information Needs into Answerable Questions

An important first step in the process of EBP requires practitioners to define information needs about a particular client problem. Sackett et al. (2000) suggest that this information needs to be framed in the form of answerable questions. Further, they recommend that questions identify the client population, intervention type, and anticipated outcomes.

Several scholars have brought elements of this first step in the EBP process to social work. In an important book on the subject of EBP, Gibbs (2003) identified a framework for posing questions that emphasizes the need for practicality. According to Gibbs, questions must be client-oriented and they must be specific enough to guide a search for evidence using electronic resources. Gambrill (2005) summarized effectively the types of questions that are generally posed in EBP processes. Her synopsis includes the following question types: 1) effectiveness, 2) prevention, 3) assessment, 4) description, 5) prediction, 6) harm, and 7) cost-benefit.

Framing practice-relevant questions is the foundation of the EBP process. Questions must be specific and posed in terms that lead to a rational search for evidence. An illustration of an effectiveness question may be helpful in understanding the importance of this point. Suppose a practitioner in a substance abuse program is interested in knowing whether a cognitive-behavioral intervention is more effective than a 12-step treatment program for addressing alcohol abuse in adults. In this case, a logical practice question might be: Is a structured cognitive-behavioral intervention more effective than a self-help program in treating alcohol abuse in adults? In a second example, suppose practitioners and teachers in a local elementary school are concerned about the negative effects of bullying behaviors in the classroom. In this example, a school social worker might pose a question about the best way to address aggression. A typical question might be: Is a universal prevention approach aimed at changing social norms about aggression more effective than a skills training approach that seeks to reduce aggression by targeting only high-risk youth?

Posing answerable questions requires precision and practice. Students and practitioners must be trained to pose different types of practice-relevant questions and learn ways to retrieve evidence that is critical in answering such questions.

Step 2: Locating Evidence to Answer Questions

Step 2 requires practitioners to search for and locate evidence pertaining to the questions they pose. At least four sources are available currently to search for empirical evidence: 1) books and journals, 2) systematic reviews organized by client problem or treatment approach that detail the effects of interventions on specified outcomes, 3) published “lists” of effective programs by federal entities and research centers, and 4) practice guidelines that offer treatment protocols based on empirical evidence.

Books and Journals

Books and journals represent a traditional approach to answering practice-relevant questions identified in step 1. Printed books and journal articles are readily available and have traditionally been helpful information sources. However, practitioners must also be aware of the limitations inherent in books and journals. For example, there is often a significant time lag between the submission and subsequent publication of a book or journal article. Practitioners must also have the skills to identify and discern published findings that pertain to their questions. This requires knowing how to select and search appropriate databases for information. In addition, practitioners must be trained to recognize that findings reported in book chapters and other outlets may not have been subject to peer review.

A final limitation of books and journals as information sources relates to the types of articles commonly published in social work. For example, at least one investigation has revealed that relatively few intervention outcome studies are published in social work literature (Rosen, Proctor, & Staudt, 1999). The lack of outcome studies poses a limitation to practitioners searching for evidence pertaining to the efficacy of interventions.

Systematic Reviews

Systematic reviews are comprehensive evaluations that examine evidence about the effectiveness of interventions targeted to a range of client populations and problems. Leadership in disseminating knowledge of effective prevention and treatment approaches through the publication of systematic treatment outcome reviews has come from international interdisciplinary teams organized under the Campbell Collaboration (2007) and the Cochrane Collaboration (2007). Each of these groups disseminates the results of systematic reviews to inform practitioners about the effects of interventions in health, behavioral, and educational settings. Importantly, systematic reviews of treatment outcomes are also becoming more available in the social science literature (Vaughn & Howard, 2004).

Lists of Efficacious Programs

A third dissemination approach has been organized by federal entities and independent research centers such as the Substance Abuse and Mental Health Services Administration (SAMHSA) and the Center for the Study and Prevention of Violence (CSPV) at the University of Colorado. For example, SAMHSA (2007) publishes a list of efficacious substance abuse prevention and treatment programs in the National Registry of Evidence-Based Programs and Practices. The agency identifies promising, effective, and model programs on the basis of methodological rigor and client outcomes. CSPV (2007) identifies effective violence prevention programs as part of its Blueprints for Violence Prevention dissemination effort. At least one group concerned with the effects of school-based educational programs for high-risk youth has also published lists of effective interventions (Collaborative for Academic, Social, and Emotional Learning, 2003).

In psychology, concern about the failure of many therapists to use empirically supported treatments led to the establishment of the American Psychological Association (APA) Task Force on the Promotion and Dissemination of Psychological Procedures in 1993 (Barlow, Levitt, & Bufka, 1999). The Task Force was established by the APA Society of Clinical Psychology (Division 12) to identify efficacious treatments across a range of mental health disorders and problems. Task Force members with expertise in diverse therapeutic approaches and populations developed criteria for treatments deemed to be well established and empirically validated and for treatments considered to be probably efficacious. Well-established treatments were those therapies that evidenced efficacy in at least two independent and rigorous experimental studies. Probably efficacious treatments were therapies in which only one study supported a treatment's efficacy, or therapies that had been tested by a single investigator (Task Force on the Promotion and Dissemination of Psychological Procedures, 1995).

The Task Force recognized randomized clinical trials as the most rigorous and acceptable method of producing empirically supported treatments. In lieu of randomized trials, findings from a large series of single-case design experiments could also satisfy the criteria. The Task Force initiated a search for efficacious and probably efficacious treatments in 1993 (Task Force on the Promotion and Dissemination of Psychological Procedures, 1995). The subsequent list of efficacious therapies has since been updated twice (Chambless et al., 1996; Chambless et al., 1998).

Compilations of effective programs allow practitioners to access considerable information about the efficacy of interventions targeted to a wide range of client groups and problems. Credible lists such as those identified above use rigorous selection criteria to identify effective programs. For example, to be included on the program list compiled by the CSPV at the University of Colorado, intervention studies must use strong research designs and demonstrate sustained effects. Replication of effects is also required to meet criteria for the highest level of evidence. Similarly, APA criteria clearly identify the levels of research rigor that are necessary to meet standards for efficacious or probably efficacious treatments.

Lists of EBPs lead practitioners to potentially effective interventions. However, such lists cannot simply be accepted uncritically. In all cases, practitioners should scrutinize the criteria used to identify effective programs and interventions when they consider selecting and implementing programs from lists of EBPs.

Practice Guidelines

Practice guidelines are a fourth method of disseminating knowledge of efficacious interventions to practitioners. Proctor and Rosen (2003) defined practice guidelines as “a set of systematically compiled and organized knowledge statements designed to enable practitioners to find, select, and use appropriately the interventions that are most effective for a given task” (p. 108). Guidelines offer specific treatment protocols for practitioners that, when followed, mirror the strategies used in efficacious interventions with similar types of clients. Clinical practice guidelines were introduced in medicine and have recently spread to psychology and social work. Guidelines in social work have been met with mixed reaction and their development and application have been limited to date (see Howard & Jenson, 1999a, 1999b, 2003 and Rosen & Proctor, 2003 for a discussion of practice guidelines in social work).


Sources of information and evidence have proliferated widely in recent years. Practitioners must possess a range of information retrieval skills to identify appropriate sources of credible evidence. The appraisal of such evidence, discussed next, is a critical next step in the EBP process.

Steps 3 & 4: Appraising and Applying Evidence to Practice and Policy Decisions

EBP requires practitioners to use their knowledge of research design and methodology to evaluate and apply evidence to practice situations. These steps require familiarity with research methodology and the ability to draw conclusions about the utility of information on the basis of levels of evidence. The scientific community recognizes findings produced by randomized controlled trials as the most rigorous and acceptable level of evidence. However, results from studies using correlational, single-subject, quasi-experimental, experimental, and meta-analytic designs must also be considered and evaluated in steps 3 and 4 (Thyer, 2004).

Evaluating the rigor of studies and selecting interventions that meet high research standards require advanced training in methodology and intervention research. Unfortunately, current standards for research training in most Master of Social Work programs fall short of assuring the advanced skills necessary to critically evaluate the validity and applicability of research reports. Additional course work in evaluating evidence should be included in the graduate social work curriculum.

A second concern in appraising and applying evidence to practice situations comes from studies suggesting that practitioners fail to routinely consult research evidence when selecting interventions. For example, several studies show that practitioners often choose interventions for reasons other than empirical evidence (Elliott & Mihalic, 2004; Rosen, Proctor, Morrow-Howell, & Staudt, 1995). In addition, agency and organizational policies that limit the choice of intervention approaches available to practitioners often constrain practitioners' ability to use EBP.

The flurry of activity associated with EBP is not confined to selecting and implementing well-tested programs. To develop new knowledge about the effects of interventions, a small but increasing number of social work researchers are testing the effects of interventions across different problem areas in controlled efficacy trials (Reid & Fortune, 2003). This is a promising development in view of findings suggesting there is a dearth of intervention studies in social work (Fraser, 2003; Jenson, 2005; Rosen et al., 1995). More intervention research by social work investigators is needed to contribute to the knowledge base of efficacious prevention and treatment approaches.

Step 5: Evaluating the Process

The steps in EBP appear deceptively simple at first glance. However, the process of EBP requires knowledge of current literature about the onset, prevention, and treatment of client or social problems, the ability to search for relevant information and data, and skills to evaluate and apply knowledge obtained in systematic searches. The complexity involved in steps one to four demands an ongoing evaluation of one's knowledge of current literature, familiarity with constantly changing electronic databases, and skills in drawing conclusions based on methodological rigor.

Gibbs (2003) effectively summarizes the process of EBP: “Placing the client's benefits first, evidence-based practitioners adopt a process of lifelong learning that involves continually posing specific questions of direct and practical importance to clients, searching effectively for the current best evidence to each question, and taking appropriate action guided by evidence” (p. 6). Most scholars would agree that the social work profession is in the beginning stage of implementing the process defined by Gibbs in practice, education, and research settings.

Challenges and Implications

The promotion of EBP in social work was attributed initially to individual scholars and small groups of researchers (e.g., Gambrill, 1999, 2003; Howard & Jenson, 1999a; Proctor & Rosen, 2003; Thyer, 2004). These early efforts were aimed largely at exposing social workers to definitions of EBP and to concurrent developments in evidence-based medicine. Discussion of the process of applying EBP principles to social work practice and policy soon followed (for example, Bilson, 2005; Gambrill, 2003, 2006; Gibbs, 2003).

A significant number of social work researchers and educators have since acknowledged the importance of EBP. Support is evident in the exponential growth in the number of books and articles on EBP since 2003 (see Gambrill, 2005, 2007; Howard, Himle, Jenson, & Vaughn, in press; Rosenthal, 2004 for reviews). Sessions on EBP have increased significantly at recent national social work conferences sponsored by the Society for Social Work and Research and the Council on Social Work Education. Further, a 2006 University of Texas at Austin symposium on EBP signaled an increasing recognition of the importance of teaching EBP in the social work curriculum. The Austin conference led to the publication of a 2007 special issue of Research on Social Work Practice that summarized the viewpoints of presenters at the symposium. Transparency in the use of EBP in practice and education (Gambrill, in press), steps required to teach EBP (Mullen, Bellamy, Bledsoe, & Francois, in press), and structural curricular reforms consistent with EBP (Howard & Allen-Meares, in press; Jenson, in press) are among the topics discussed in that issue.

An increase in attention to EBP by social work educators is indisputable. However, EBP is not without its critics. There have been voices of skepticism (Taylor & White, 2002) and even rejection (Webb, 2001) characterized by claims that EBP offers nothing new to the field. Others point to the lack of an effective knowledge base for certain client problems and populations, which hinders the advancement of EBP in the field.

EBP is at an important turning point in social work. To some, it reflects a new and revolutionary practice approach that holds great promise for building stronger bridges between science and social work (Gambrill, 2007; Jenson, 2005). Others view EBP as a repackaged attempt to integrate research and practice that is fraught with educational and implementation problems (Webb, 2001). Regardless, the challenges of EBP to social work education, practice, and research are varied and complex.

EBP and Social Work Education

The Challenge of Educational Reform

Rubin and Parrish (2007) reported that more than 70% of respondents from a survey of social work educators were in favor of teaching EBP in the MSW curriculum. Rapp-Paglicci (2007) noted that as many as 40 social work programs have created classes that incorporate principles of EBP. At least one school of social work—the Brown School of Social Work at Washington University—has identified EBP as the organizational framework for its graduate curriculum (Edmond, Rochman, Megivern, Howard, & Williams, 2006; Howard, McMillen, & Pollio, 2003). Importantly, the Council on Social Work Education has identified EBP as an important principle in its educational policy and accreditation documents (Council on Social Work Education, 2004). These and other examples illustrate the increasing attention being paid to EBP in the social work curriculum.

It is also clear, however, that interest in EBP has not yet resulted in the adoption or implementation of significant curriculum reform. To illustrate, Woody, D'Souza, and Dartman (2006) reported less than encouraging findings from a survey of social work deans and directors examining whether and how their programs teach empirically supported interventions. Woody et al. (2006) noted that, “only 31 programs, less than half, had endorsed teaching specific ESI [Empirically Supported Interventions] content; still fewer, 26, had designated courses to teach specific ESI content; and of the 31 programs that had endorsed teaching ESI, very small numbers required ESI training materials designed for teaching students the skills and techniques for implementing the interventions” (p. 474).

Significant structural and pedagogical changes in social work education are necessary to teach EBP. For example, a new generation of students must be exposed to the complexities involved in posing relevant practice and policy questions. Students must become experts in information retrieval and possess the methodological skills necessary to evaluate and apply evidence. New and innovative teaching approaches will be required to systematically teach EBP. Faculty will need to be trained, and in some cases retrained, to teach EBP. Finally, the appropriate location for teaching the actual process of EBP must be determined in undergraduate and graduate curricula.

Teaching the Process of EBP

Above all, EBP is a process characterized by the five specific steps discussed above. Thus, a logical assumption is that educators should focus their efforts on teaching the actual process of conducting EBP. However, the degree to which faculty members in schools of social work are teaching the five-step process of EBP—or simply informing students of effective interventions—is unclear. Several scholars, most notably Gambrill (2007), caution that exposing students to only EBPs identified on compiled lists and national registries is inconsistent with the fundamental premise of EBP. She accurately notes that a singular focus on effective interventions, expressed through commonly used terms such as best practices, is taking focus away from teaching students the actual process of EBP. Gambrill (2007) further suggests that emphasizing EBPs at the cost of understanding the process of EBP is inconsistent with the original intent of EBP as an approach that fosters transparency and systematic decision-making with clients.

The importance of teaching students the actual process steps of EBP cannot be overstated. EBP is a philosophy and an approach to practice that requires students and practitioners to understand and apply its essential steps. Teaching students to identify and use lists of established EBPs to select interventions is but one small part of the EBP process. As Gambrill (2007) so eloquently states, the emphasis on EBPs “ignores the process of EBP that describes skills designed to help practitioners to integrate external findings with other vital information (e.g., concerning client characteristics and circumstances) such as posing well-structured questions, and ignores the importance of creating tools practitioners need such as access to high-speed computers with relevant databases” (p. 430).

Schools of social work must take bold steps to integrate EBP across the curriculum. Training in EBP occurs sporadically in most schools, with little consistent application across key parts of the curriculum. Therefore, discussions about the best place (e.g., practice or research courses) to teach the process of EBP in the curriculum are needed. In addition, new teaching techniques such as problem-based learning that are compatible with EBP should be examined for applicability in social work education (Gambrill, in press; Sackett et al., 2000). Finally, structural changes in long-held traditions such as advanced standing may need to be considered in the interest of increasing students' exposure to the complexities of EBP (Jenson, in press).

EBP and Social Work Practice

EBP is receiving considerable attention from local, state, and federal policy makers and funding sources. State and local systems of care, private foundations, and federal entities have entered the debate about the best ways to select and implement effective interventions for clients and client systems. Agency administrators and practitioners are working diligently to understand EBP in an effort to develop competitive research proposals and implement effective program components.

One significant practice challenge is how to teach principles of EBP to practitioners and agency administrators. Community agencies vary widely with respect to their awareness, understanding, and acceptance of EBP. Community partnerships and collaborative research projects such as those being developed at the University of Toronto (Regehr, Stern, & Shlonsky, 2007) are needed to help practitioners apply EBP principles in a wide variety of practice settings. The Faculty of Social Work at the University of Toronto, for example, has created an institute for evidence-based social work that aims to develop and foster community collaborations (Regehr et al., 2007). This and similar models should be further developed and tested.

EBP and Research

EBP relies on the availability of accrued knowledge about a range of individual and social problems. Thus, it is imperative that new knowledge about the etiology, prevention, and treatment of problems be consistently developed. In this regard, rigorous research is needed across many or all substantive areas in social work. Intervention research to assess the efficacy and effectiveness of social interventions is particularly lacking. Such studies are necessary to advance the etiological and intervention knowledge bases available to practitioners who are interested in implementing EBP.

The translation of research evidence to practice and policy is a second important area of research. In many service sectors there is a considerable lag between the identification of efficacious treatments and the application of such treatments to practice and policy. Recently, entities such as the National Institute of Mental Health have emphasized the importance of translating research findings to the field (Brekke, Ell, & Palinkas, 2007). Models for translating research evidence to practice and policy in health care and adolescent service sectors have been offered by Gray (2001) and Jenson and Fraser (2006) respectively.

The careful translation of research into practice is particularly important in view of the rapid increase in practices and publications that are promoted as EBP but in reality fall short of the principles implied in EBP. For example, the sudden infusion and proliferation of terms that resemble EBP, but are not EBP, may have an adverse effect on the profession's interest in using EBP to enhance the connection between science and intervention. Phrases such as “best practices” and “exemplary programs” are frequently used for marketing clinical and community interventions. On closer examination, these terms may or may not reflect the underlying processes of EBP. In many cases, interventions packaged under such names are not based on empirical evidence and have not been subject to rigorous evaluation. Promoting untested interventions as evidence-based promotes a false sense of efficacy, erodes the basic principles of EBP, and dilutes commonly accepted definitions of EBP used in medicine and psychology.

Finally, research is needed to systematically assess the effects of implementing EBP with clients. Embedded in EBP is the notion that client outcomes will be improved significantly by using EBP. As EBP becomes more widely applied in practice, studies will be needed to assess the relationship between its use and client outcomes.

EBP offers the promise of a new approach to social work education and practice that will dramatically alter the profession for years to come. The move to EBP as a guiding educational framework will require schools of social work to include the essential elements of EBP training (e.g., posing practice-relevant questions, gaining sophisticated information retrieval skills, interpreting systematic reviews, applying clinical practice guidelines, etc.) in graduate courses. In addition, schools of social work must also assume leadership in assisting community-based and human service sectors to understand and apply EBP in practice and policy settings. The challenge and risk of such comprehensive change represents an exciting new opportunity in social work practice and education. The endorsement of EBP is a risk well worth taking.


  • Barlow, D. H., Levitt, J. T., & Bufka, L. F. (1999). The dissemination of empirically supported treatments: A view to the future. Behaviour Research and Therapy, 37, 147–162.
  • Bilson, A. (Ed.). (2005). Evidence-based practice in social work. London: Whiting & Birch.
  • Brekke, J. S., Ell, K., & Palinkas, L. A. (2007). Translational science at the National Institute of Mental Health. Can social work take its rightful place? Research on Social Work Practice, 17, 123–133.
  • Campbell Collaboration. (2007). Retrieved April 30, 2007.
  • Center for the Study and Prevention of Violence. (2007). Blueprints for violence prevention. Retrieved April 30, 2007, from the University of Colorado Web site.
  • Chambless, D. L., et al. (1996). An update on empirically validated therapies. The Clinical Psychologist, 49, 5–18.
  • Chambless, D. L., et al. (1998). Update on empirically validated therapies. II. The Clinical Psychologist, 51, 3–15.
  • Cochrane Collaboration. (2007). Retrieved May 1, 2007.
  • Collaborative for Academic, Social, and Emotional Learning. (2003). Safe and sound: An educational leader's guide to evidence-based social and emotional learning. Chicago: Author.
  • Council on Social Work Education. (2004). Educational policy and accreditation standards. Washington, DC: Author.
  • Edmond, T., Rochman, E., McGivern, D., Howard, M., & Williams, C. (2006). Integrating evidence-based practice and social work field education. Journal of Social Work Education, 42, 377–396.
  • Elliott, D. S., & Mihalic, S. (2004). Issues in disseminating and replicating effective prevention programs. Prevention Science, 5, 47–52.
  • Fraser, M. W. (2003). Intervention research in social work: A basis for evidence-based practice and practice guidelines. In A. Rosen & E. K. Proctor (Eds.), Developing practice guidelines for social work intervention: Issues, methods, and research agenda (pp. 17–36). New York: Columbia University Press.
  • Gambrill, E. (1999). Evidence-based practice: An alternative to authority-based practice. Families in Society, 80, 341–350.
  • Gambrill, E. (2003). Evidence-based practice: Implications for knowledge development and use in social work. In A. Rosen & E. K. Proctor (Eds.), Developing practice guidelines for social work intervention: Issues, methods, and research agenda (pp. 37–58). New York: Columbia University Press.
  • Gambrill, E. (2005). Critical thinking in clinical practice (2nd ed.). Hoboken, NJ: John Wiley & Sons.
  • Gambrill, E. (2006). Evidence-based practice and policy: Choices ahead. Research on Social Work Practice, 16, 338–357.
  • Gambrill, E. (2007). To be or not to be: Will five-step be used by clinicians? Research on Social Work Practice, 17, 428–434. A review of J. C. Norcross, L. E. Beutler, & R. F. Levant (Eds.), Evidence-based practices in mental health: Debate and dialogue on the fundamental questions. Washington, DC: American Psychological Association.
  • Gambrill, E. (in press). Transparency as the route to evidence-informed professional education. Research on Social Work Practice.
  • Gibbs, L. (2003). Evidence-based practice for the helping professions. Pacific Grove, CA: Brooks/Cole.
  • Gray, J. A. M. (2001). Evidence-based health care: How to make health policy and management decisions (2nd ed.). New York: Churchill Livingstone.
  • Howard, M. O., & Allen-Meares, P. (in press). Teaching evidence-based practice: Strategic and pedagogical recommendations for schools of social work. Research on Social Work Practice.
  • Howard, M. O., Himle, J., Jenson, J. M., & Vaughn, M. G. (in press). Revisioning social work clinical education: Recent developments in relation to evidence-based practice. Journal of Evidence-Based Social Work.
  • Howard, M. O., & Jenson, J. M. (1999a). Clinical practice guidelines: Should social work develop them? Research on Social Work Practice, 9, 283–301.
  • Howard, M. O., & Jenson, J. M. (1999b). Barriers to development, utilization, and evaluation of social work practice guidelines: Toward an action plan for social work research. Research on Social Work Practice, 9, 347–364.
  • Howard, M. O., & Jenson, J. M. (2003). Clinical practice guidelines and evidence-based practice in medicine, psychology, and allied professions. In A. Rosen & E. K. Proctor (Eds.), Developing practice guidelines for social work intervention: Issues, methods, and research agenda (pp. 83–107). New York: Columbia University Press.
  • Howard, M. O., McMillen, J. C., & Pollio, D. (2003). Teaching evidence-based practice: Toward a new paradigm for social work education. Research on Social Work Practice, 13, 234–259.
  • Jenson, J. M. (2005). Connecting science to intervention: Advances, challenges, and the promise of evidence-based practice. Social Work Research, 29, 131–135.
  • Jenson, J. M. (in press). Evidence-based practice and the reform of social work education: A response to Gambrill and to Howard and Allen-Meares. Research on Social Work Practice.
  • Jenson, J. M., & Fraser, M. W. (2006). Social policy for children and families: A risk and resilience perspective. Thousand Oaks, CA: Sage.
  • Mullen, E. J., Bellamy, J. L., Bledsoe, S. E., & Francois, J. J. (in press). Teaching evidence-based practice. Research on Social Work Practice.
  • Proctor, E. K., & Rosen, A. (2003). The structure and function of social work practice guidelines. In A. Rosen & E. K. Proctor (Eds.), Developing practice guidelines for social work intervention: Issues, methods, and research agenda (pp. 108–127). New York: Columbia University Press.
  • Rapp-Paglicci, L. (2007). To be or not to be: Will evidence-based practice be used by clinicians? Research on Social Work Practice, 17, 427–428. A review of A. R. Roberts & K. R. Yeager (Eds.), Evidence-based practice manual: Research and outcome measures in health and human services. New York: Oxford University Press and A. R. Roberts & K. R. Yeager (Eds.), Foundations of evidence-based social work practice. New York: Oxford University Press.
  • Regehr, C., Stern, S., & Shlonsky, A. (2007). Operationalizing evidence-based practice: The development of an institute for evidence-based social work. Research on Social Work Practice, 17, 408–416.
  • Reid, W. J., & Fortune, A. E. (2003). Empirical foundations for practice guidelines in current social work knowledge. In A. Rosen & E. K. Proctor (Eds.), Developing practice guidelines for social work intervention: Issues, methods, and research agenda (pp. 59–79). New York: Columbia University Press.
  • Rosen, A., & Proctor, E. K. (Eds.). (2003). Developing practice guidelines for social work intervention: Issues, methods, and research agenda. New York: Columbia University Press.
  • Rosen, A., Proctor, E. K., Morrow-Howell, N., & Staudt, M. M. (1995). Rationales for practice decisions: Variations in knowledge use by decision task and social work service. Research on Social Work Practice, 5, 501–523.
  • Rosen, A., Proctor, E. K., & Staudt, M. (1999). Social work research and the quest for effective practice. Social Work Research, 23, 4–14.
  • Rosenthal, R. N. (2004). Overview of evidence-based practice. In A. R. Roberts & K. R. Yeager (Eds.), Evidence-based practice manual: Research and outcome measures in health and human services (pp. 20–29). New York: Oxford University Press.
  • Rubin, A., & Parrish, D. (2007). Views of evidence-based practice among faculty in master of social work programs: A national survey. Research on Social Work Practice, 17, 110–122.
  • Sackett, D. L., Rosenberg, W., Gray, J. A. M., Haynes, R. B., & Richardson, W. S. (1996). Evidence-based medicine: What it is and what it isn't. British Medical Journal, 312, 71–72.
  • Sackett, D. L., Straus, S. E., Richardson, W. S., Rosenberg, W., & Haynes, R. B. (2000). Evidence-based medicine: How to practice and teach EBM (2nd ed.). New York: Churchill Livingstone.
  • Substance Abuse and Mental Health Services Administration. (2007). National Registry of Evidence-Based Programs and Practices. Retrieved April 29, 2007, from
  • Task Force on Promotion and Dissemination of Psychological Procedures (1995). Training in and dissemination of empirically-validated psychological treatments. The Clinical Psychologist, 48, 3–23.
  • Taylor, C., & White, S. (2002). What works about what works? Fashion, fad, and EBP. Social Work and Social Sciences Review, 10, 63–81.
  • Thyer, B. A. (2004). What is evidence-based practice? Brief Treatment and Crisis Intervention, 4, 167–176.
  • Vaughn, M. G., & Howard, M. O. (2004). Integrated psychosocial and opioid-antagonist treatment for alcohol dependence: A systematic review of controlled evaluations. Social Work Research, 28, 41–55.
  • Webb, S. (2001). Some considerations on the validity of evidence-based practice in social work. British Journal of Social Work, 31, 57–79.
  • Wennberg, J. E. (2002). Unwanted variations in healthcare delivery: Implications for academic medical centers. British Medical Journal, 325, 961–964.
  • Woody, J. D., D'Souza, H. J., & Dartman, R. (2006). Do master's in social work programs teach empirically-supported interventions? Research on Social Work Practice, 16, 469–479.