Evidence-Informed Social Work Practice
Abstract and Keywords
Evidence-informed practice (EIP) is a model that incorporates best available research evidence; clients’ needs, values, and preferences; practitioner wisdom; and theory into the clinical decision-making process, filtered through the lens of client, agency, and community culture. The purpose of this article is to define and describe the evidence-informed practice model within social work and to explore its evolution over time. The article distinguishes evidence-informed practice from the more commonly known (and perhaps more popular) evidence-based practice. Having outlined the essential components of evidence-informed practice, it describes the barriers to their effective implementation. Critical contextual factors related to the implementation of evidence-informed practice at the individual level, as well as within social work organizations, are also addressed. Finally, implications for both social work practice and education are explored.
Since the late 20th century there has been a move within social work and other helping professions toward utilizing empirically derived evidence as a foundation for practice. The expectation from funders, policy makers, and social work academics that practitioners will use the best available evidence to inform their practice has steadily intensified both within the United States and globally, especially in the United Kingdom and Australia (Arney, Bromfield, Lewig, & Holzer, 2009; McEwan, Crawshaw, Liversedge, & Bradley, 2008). Several approaches for the conscious utilization of evidence (empirically grounded findings on problems, treatment techniques, and outcomes) have emerged, including empirical clinical practice, evidence-based practice, evidence-guided practice, and evidence-informed practice. There has been considerable debate about what can and should count as evidence, the role that evidence should play in defining practice, who or what should be the source of the evidence, and how that evidence then informs and impacts practice. This article focuses on evidence-informed practice (EIP), which is one prominent model for integrating evidence into social work practice.
Evidence-Informed Practice Defined
Several definitions of evidence-informed practice have been offered. For example, McBeath and Austin (2015) suggest that the “evidence-informed practice model encourages practitioners to draw on and integrate various streams of knowledge into individual decision-making, including service user preferences, clinician experience and practice wisdom, and the best available scientific evidence” (p. 2). Similarly, Haight (2010) proposes that evidence-informed practice occurs when “evidence from empirical research is integrated with professional understanding of sociocultural context—including clients’ beliefs, values, and behaviors—as a guide to intervention” (p. 101). Shlonsky, Noonan, Littell, and Montgomery (2011) also emphasize the importance of context, noting that evidence-informed practice “asks practitioners and policy makers to integrate current best evidence with client context in order to provide meaningful and potentially effective services across a range of presenting problems” (p. 362). In their definition, Dill and Shera (2009) acknowledge the contribution of practitioner wisdom within the model, stating that evidence-informed practice represents the “integration of the best available research combined with real-world or tacit knowledge” (p. 156). In this way it is the combination of evidence, knowledge, and context that is important within evidence-informed practice.
Taken together, the characteristics and definitions illustrate that evidence-informed practice includes different combinations of attention to empirical evidence; theory; practice wisdom/tacit knowledge; clients’ values, preferences, and voice; and assessment of key contexts (Dodd & Savage, 2014) (see Figure 1).
It is important to note that in addition to the evidence-informed practice model, some use evidence-informed practices as a plural noun. Proctor (2007) drew an analogous distinction for EBP: evidence-based practice as a verb refers to the model or process, while evidence-based practice or practices as a noun refers to what others have called evidence-supported interventions, or empirically supported interventions (Thyer & Pignotti, 2011). Evidence-supported interventions are those that have been demonstrated to be effective, preferably via randomized clinical trials or systematic reviews. Similarly, evidence-informed practice is sometimes used to refer to intervention models that have been assessed using at least some systematic empirical inquiry and have shown positive results. Unlike evidence-based practice, for which there are specifically designed criteria for evidence (e.g., at least one randomized clinical trial yielding favorable results), the criteria for EIP are not explicitly prescribed.
Origins of Evidence-Informed Practice
The importance of the role of research and evidence in practice is not new; in fact, it has been recognized for nearly a century. As Reid (1994) notes, Mary Richmond identified the science base of practice as a building block of the profession in her 1917 classic Social Diagnosis. Then, starting in the 1960s, an emphasis on empirical practice developed in the form of empirical clinical practice and research-based practice (Epstein, 2009; Okpych & Yu, 2014). According to Reid (1994), empirical clinical practice had three identifiable facets, facets that have endured in our latest understanding of evidence-informed practice: (1) using research methods and tools in assessment, intervention choice, and outcome evaluation; (2) utilizing interventions that have research support; and (3) building knowledge through dissemination of studies done by practitioners. Starting in 1984 the Council on Social Work Education (CSWE) supported the importance of research in practice and included research content in curriculum requirements (Reid, 1994). The Council on Social Work Education’s 2008 Educational Policy and Assessment Standards (EPAS) included emphasis on research-informed practice and practice-informed research (CSWE, 2008). The 2015 EPAS includes research consumption as an explicit practice behavior under research. Significantly, the competency on practice intervention makes explicit recognition of the role of evidence-informed practice, requiring that “social workers are knowledgeable about evidence-informed interventions to achieve the goals of clients and constituencies, including individuals, families, groups, organizations, and communities . . . Social workers understand methods of identifying, analyzing and implementing evidence-informed interventions to achieve client and constituency goals” (CSWE, 2015, p. 9).
The more recent movement to include evidence as an essential and central foundation for practice developed in the mid 1990s within medicine as a way to “bring research findings to medical practice decisions” (Jenson & Howard, 2013, Definitions and Evolution of EBP, paragraph 1). Most often referred to as evidence-based medicine and alternately as evidence-informed medicine (Chalmers, 2005), its principles and language were then embraced in other helping professions, including education, nursing, and public health.
In their initial discussion of the inclusion of evidence in medicine, Sackett, Rosenberg, Gray, Haynes, and Richardson (1996) defined evidence-based practice as the “conscientious, explicit, and judicious use of current best evidence in making decisions about care” (p. 71). This definition was later expanded to include three dimensions, namely best practice evidence, clinical expertise, and client needs or values. Simultaneously, within social work a critique was emerging that called for social workers to rely on more than just practice theory and “gut reactions” to drive interventions with clients. The call was for practice theories and practitioner wisdom to be supplemented, if not supplanted, by empirically generated information regarding which interventions worked, for whom, and in what circumstances (Gambrill, 1999, 2001, 2003). This call emerged in the context of a growth of social work and other behavioral research, as well as financial pressure to increase the predictable outcomes of social work intervention while also increasing efficiency (Arney, Bromfield, Lewig, & Holzer, 2009). The logic of evidence-based medicine was subsequently embraced and applied to social work in the form of evidence-based practice by academics such as Gambrill (1999), Gibbs (2003), and Gray, Plath, and Webb (2009).
During this time, evidence-based practice and evidence-informed practice were employed as different terms for essentially the same concepts (see for example Gambrill, 2008, 2010), but then were subsequently defined quite differently (Epstein, 2009; Haight, 2010; Shlonsky & Mildon, 2014). In both models there is a very strong emphasis on evidence, that is, research findings, as a driver of intervention choice. However, the evidence-informed practice model and those who apply it acknowledge that evidence alone is not sufficient to drive and improve practice or policy (Chalmers, 2005).
Distinguishing Characteristics of Evidence-Informed Practice
Evidence-informed practice is a dynamic practice model that places practitioner decision-making at the center of social work practice but that expects the practitioner to consider a range of empirical and other evidence and knowledge when determining the best possible treatment intervention for each client. Those who promote evidence-informed practice distinguish it from other research and evidence-based models by emphasizing that utilization of the term “informed” allows evidence to serve as one dimension of the decision-making process but not as the only factor and not necessarily as the most heavily weighted factor. So, for example, Eccles (2009) suggests that using “informed” when describing the role of evidence in practice “deliberately places human agency at the heart of decision making” (p. 193). Reflecting the more inclusive nature of both evidence and the decision-making process within evidence-informed practice, she reports preferring the term “informed” because when applied to practice “research is a key but not sole determinant of decision making and evidence is inclusive of practice wisdom and service user reviews” (p. 193). Echoing this sentiment, Shlonsky, Noonan, Littell, and Montgomery (2011) suggest that within the evidence-informed practice model, use of the term “evidence-informed better conveys that decisions are guided or informed by evidence rather than based solely upon it, and that clients themselves are informed consumers of services” (p. 363). Similarly, Epstein (2009) suggests that evidence plays an important but not solitary role in clinical decision-making, arguing “practice knowledge and intervention decisions might be enriched by prior research but [are] not limited to it” (p. 224).
Another distinguishing characteristic of evidence-informed practice involves the types of evidence that it embraces. Evidence-informed practice acknowledges the potential contribution of a range of types of evidence and does not privilege one type of evidence or knowledge over others. This contrasts with evidence-based practice (EBP), in which the emphasis is on evidence gleaned from carefully constructed experimental and quasi-experimental designs and systematic reviews. In detailing the proceedings of “Beyond the Rhetoric: International Perspectives on Evidence-Informed Practice,” Petch (2009) reflected this embrace of a range of types of evidence, describing the three agreed components of evidence-informed practice as (1) methodological pluralism, (2) other forms of knowledge, and (3) knowledge exchange. Epstein (2009) also articulated principles behind evidence-informed practice. Reflecting some of the same principles as Petch (2009), he included that evidence-informed practice is “methodologically and culturally pluralist; honors practice-wisdom as well as research-based knowledge; supports practitioner-researchers and academic researchers alike; [includes a] full continuum of collaborative relationships; promotes practice-driven research and research-driven practice; and empowers practitioners as co-creators of social work knowledge” (Epstein, 2009, p. 225).
An additional distinguishing characteristic is that rather than following a fixed set of linear steps (as is the case with EBP), evidence-informed practice is an iterative, ongoing practice process that accesses and employs evidence and knowledge within the context of a developing therapeutic relationship. As treatment progresses, the evidence-informed practice process allows for continual adjustment of the treatment approach in response to emerging information.
What Counts as Evidence in Evidence-Informed Practice?
Critically important in defining evidence-informed practice as a model for practice is an understanding of what actually constitutes evidence within the model. Strong proponents of the evidence-based practice (EBP) model emphasize a hierarchy of evidence (analogous to Maslow’s hierarchy of needs) that has as its pinnacle, or “gold standard,” the randomized control trial (RCT), descends through quasi-experimental, correlational, and qualitative studies, and has case studies at its base (Roberts & Yeager, 2004).
Proponents of an evidence-informed approach have argued that viewing RCTs as the gold standard for evidence is both impractical and inappropriate: RCTs remain relatively uncommon within social science (Austin, 2008), and clinical trials are often conducted under highly controlled circumstances, so implementation in the real world of practice requires adaptation and may not yield the expected outcomes (Aisenberg, 2008; Betts Adams, Matto, & LeCroy, 2009; Mullen, 2014). Others have argued that RCTs are unethical in some circumstances (Dodd & Epstein, 2012; Nevo & Slonim-Nevo, 2011). Further, while intervention experiments may be appropriate for understanding the impact of interventions, they are not ideal for other purposes, such as understanding prevalence, etiology, approaches to engagement, and assessment (Shlonsky & Mildon, 2014).
In contrast to EBP, Epstein (2009) argues for a more inclusive, non-hierarchical framing of evidence within evidence-informed practice, proposing a wheel of evidence that values all practice research for its contribution to knowledge. Utilizing a wheel of evidence signifies valuing the contributions of the different methodological approaches—randomized control trials, case studies, qualitative studies, quasi-experiments, and correlational studies—which all appear on the wheel and can be selected for their particular contribution to practice knowledge, recognizing both their strengths and limitations. Presenting the different types of studies in a circular rather than the linear way more typical of research textbooks signifies that no one methodology is privileged over the others, recognizing the differing but valuable contributions of each type of evidence. In fact, Haight (2010) has argued that within the evidence-informed practice tradition the gold standard may be “studies that provide a richly contextualized analysis of social phenomena through practices such as sustained engagement and use of multiple methods—including direct observations, in-depth interviewing, and record reviews” (p. 102).
When defining evidence within evidence-informed practice it is also important to distinguish not just between the different sources of evidence but also the different purposes of the inquiry and subsequent evidence. Studies of the effectiveness of practice interventions have garnered the most attention, yet studies of etiology, prevalence, approaches to engagement, assessment, and retention techniques can be of great value for practice. Evidence related to descriptive questions, such as the prevalence of certain health and mental health issues, both in general and in particular populations; evidence in relation to risk or prognosis of particular conditions; evidence related to typical client career and patterns of progress; and important assessment information (Shlonsky & Mildon, 2014) are all potentially valuable types of evidence that can influence both practice and policy and, therefore, factor into the evidence-informed practice process.
Just as the original models of evidence-based medicine assumed the central importance of physician judgment, evidence-informed practice proponents place great value on practitioners’ “wisdom” (Austin, Dal Santo, & Lee, 2012), identifying clinical judgment as a key element in the evidence-informed practice model (Austin et al., 2012; Chalmers, 2005). The key components of practitioner wisdom include tacit knowledge, critical reflection, and intersubjectivity, which is knowledge derived in the course of the practice interchange. Practitioner wisdom is included in the evidence-informed practice model even though the movement toward evidence within social work arose partly from skepticism and even overt hostility toward the use of practice wisdom as the sole guiding force of practice decision-making (Gambrill, 1999). This skepticism stems, in part, from the fact that practice wisdom is hard to operationalize and even harder to measure. This difficulty is acknowledged by Austin (2008), who defines tacit knowledge, one key aspect of practitioner wisdom, as “knowledge that exists in the minds of workforce members, manifests itself through their actions, and is not easily articulated” but argues that nevertheless “tacit knowledge is a meaningful and important source of information that influences the decisions and actions of practitioners” (Austin, 2008, p. 580). In this way, tacit knowledge is felt more than known, and yet when making practice decisions social workers know that their practice intuition and prior experiences influence their choices. Evidence-informed practice honors this process and makes space for tacit knowledge within the model (see Figure 1).
The dimension of practitioner wisdom that encompasses critical reflection stems from Schon’s (1995) formulation of professional practice, which involves a technical rationality mode (akin to early formulations of evidence-based practice) and a “reflecting-in-action” mode (p. 54) where “you are noticing, at the very least, that you have been doing something right, and your ‘feeling’ allows you to do that something again” (p. 55). The process of critical reflection is iterative in nature and involves analysis and reanalysis of key moments (Dallos & Stedmon, 2009). Fook and Gardner (2013) capture the critical thinking or internal dialogue aspect of reflection as “the process of questioning the foundations of beliefs with a preparedness to change them in light of that questioning” (p. 3). Being open to change based on one’s assessment is a critical aspect of the technique. Schon (1995) points out that professionals of necessity focus on the mismatch of patterns of practice and knowledge as part of this process in which “complexity, uncertainty, instability, uniqueness and value conflict are present” (p. 18). The idea is that the process of critical reflection and review allows practitioners to gain distance from their experience and achieve a degree of objectivity that contributes to building tacit knowledge and influences the direction of practice (D’Cruz, Gillingham, & Melendez, 2007).
Done systematically and with attention to both individual and organizational learning, critical reflection can contribute to the body of knowledge and evidence for evidence-informed practice. In addition, critical reflection assists practitioners to identify issues for further exploration or intervention. It can also generate practice questions or working hypotheses that can be explored in the practice interchange with the clients or in examination of existing empirical literature.
A further component of practitioner wisdom is the notion of intersubjectivity. As Schon (1995) noted, professionals learn in-action. In addition to evidence that emerges from the reflective process, one important working goal of practice is to achieve a level of attunement to the client’s state and progress known as intersubjectivity. “Intersubjectivity is the mutual knowing between two people of the other’s experience, including meaning, intentions, and emotions (i.e., subjective experiences) based on a shared state (Bråten & Trevarthen, 2007; Trevarthen, 2009a)” (as cited in Arnd-Caddigan, 2011, p. 373). In addition to objectivist evidence, evidence from clinical experience, including intersubjectivity “can guide the therapist in making decisions within the practice, helping to identify the shifting goals in treatment, aid in choosing strategies that contribute to the overall improvement of the client, and document the client’s improvement” (Arnd-Caddigan, 2011, p. 372). Taken together, tacit knowledge, reflection, and intersubjectivity constitute practitioner wisdom that offers another stream of information that influences decision-making within the evidence-informed practice model.
Knowledge Derived from Theory
Within evidence-informed practice, in addition to empirical evidence and practitioner wisdom, theory-driven conceptualizations of the problem represent another stream of information to be considered by the social worker during the treatment planning and decision-making process. Theories of human development and behavior, as well as theories of causation, change, and transformation, are central to the social work knowledge base (Betts, Matto, & LeCroy, 2009). Though theories may not be tested by research, they invariably play a role in framing practice at all levels. In fact, theory, both formal and that which emerges from workers’ interactions with their clients, commonly underpins tacit knowledge.
Theories undergird the focus and techniques that practitioners use to inquire about clients’ lives and problems. Theoretical conceptualizations may determine the ways in which the social worker frames the question and the types of evidence and knowledge they seek out to inform their decision-making. Theoretical conceptualizations also serve as a filter through which evidence is appraised and its applicability to each particular client situation is assessed. For example, if one sees the world through an object relations theoretical lens, the lines of inquiry during assessment will focus in particular ways on family of origin attachments and childhood patterns and their influence on current functioning. If, by contrast, an ecological perspective dominates, the questions and attention will be differently focused on the current impact of system and environmental factors on functioning. Theoretical conceptualizations are important not just to provide perspectives on the client’s presenting problem, or the available treatment approaches, but also to provide understanding about cultural and contextual factors that may be relevant.
In addition, theories of causation, of behavior, and of change inform the ways agencies are structured, how they embrace an ideology and service philosophy, choose their service array, and allocate resources. Theory related to organizational management and ideology may impact the social worker’s capacity to employ particular interventions or embrace particular goals with a client within the agency. For example, if a homeless person with HIV uses marijuana to self-medicate, then an agency with a harm reduction perspective will be more compatible than an agency that takes an abstinence-only approach.
There is clearly a risk that theory, implicit or unexamined, may lead to ineffective or inappropriate practice. This occurs when theory rather than the client’s reality dominates the practitioner’s perceptions and actions. In such situations the practitioner’s perspectives and agency policies go unexamined for their fit with evidence, client need, or preference. Theory alone or theory unexamined is insufficient for evidence-informed practice. For this reason, critical reflection along with the client’s perspective and empirical evidence are critical to support, balance, or correct theoretical constructs and conceptualizations. Therefore, within evidence-informed practice theory is a necessary but not sufficient ingredient for practice decision-making.
Client’s Voice and Involvement
In health care, where the foundations of evidence-informed practice were laid, physicians do not proceed with a course of treatment without explaining the known risks and benefits to the patient unless the situation is urgent. Too often this is not the case in social work, where practitioners often formulate the treatment plan independently or perhaps with feedback from a supervisor or within agency practice constraints. Evidence-informed practice parallels the evidence-based medicine and evidence-based practice models in their inclusive consideration of the client as a critical contributor to defining the issues, gathering evidence, and making decisions. Its earliest proponents in social work have emphasized the client’s participation in the decision process (Gambrill, 1999, 2003, 2010).
As can be seen in Figure 1, the client’s voice and appreciation of the unique characteristics, needs, and values of the client as well as the cultural and community context of the client system are critical factors in the evidence-informed practice model. Though it has been noted that incorporation of client values is underdeveloped (Gilgun, 2005), several models for client involvement in decision-making exist. For example, Gambrill (1999) discusses Entwistle et al.’s (1998) approach for including client voice fashioned after the medical model (evidence-informed patient choice):
Evidence-informed patient choice (EIPC) entails three criteria: 1) the decision involves which health care intervention or care pattern a person will or will not receive; 2) the person is given research-based information about effectiveness (likely outcomes, risks, and benefits) of at least two alternatives (which may include the option of doing nothing); and 3) the person provides input into the decision-making process. (Gambrill, 1999, p. 346)
Another similar approach is one in which evidence is gathered and the practitioner presents the client with alternative approaches identified, along with their risks and benefits, so that the client can collaborate with the practitioner as an equal partner in choosing an initial course of action and in shifts in goals and methods during the course of treatment. Alternatively, the client can be involved directly in the process of gathering information, examining alternatives, and collaborating on the selection and planning of the treatment approach and in altering the approach as new information emerges and new goals are identified (Gambrill, 1999, 2003).
In addition to models empowering individual clients, involvement of clients on advisory boards and in evaluation efforts maintains client voice within the practice setting. In fact, in some countries, service user involvement in research, policy, and service design is encouraged. While called for in the earliest evidence-informed practice models, developing thoughtful, empowering approaches to involvement of the service user in their own intervention plans as well as involvement of consumers as a group of stakeholders in the organization warrants greater attention.
Cultural and Contextual Factors
In describing an applied example of evidence-informed practice to maternal and women’s health globally, Snelgrove-Clarke and Rush (2011) point out the importance of context within practice and acknowledge the importance of both cultural context and client voice:
As we have learned from international efforts, success is evident [in transferring evidence into health care practice] only when interventions are based upon the community within which they are going to be applied, that is, they are context specific. Inclusion of the client’s voice, respecting their culture, and applying evidence throughout the continuum of care (Callister & Edwards, 2010) can help. (p. 126)
Consistent with the person in environment perspective central to social work practice, it makes sense to consider the contexts in which the service users and service providers are embedded in order to successfully engage in evidence-informed practice. Shlonsky and Mildon (2014), building on a model first presented by Regehr, Stern, and Shlonsky (2007), note that “individual clinical decisions do not occur in a vacuum. Rather, they occur in a vastly complex social context that is likely to have at least as much to do with overall client outcomes as clinical work” (p. 21). They suggest that practitioners take note of the economic and political context as well as the historical, cultural, and professional and organizational environment, which clearly impact both service providers and those seeking services (Shlonsky & Mildon, 2014).
An obvious consideration in understanding the needs and problems of clients is to understand the cultural and community context in which they are embedded. There has been exploration of the relationship between culturally competent practice and evidence-based practice. Some have noted that the heavy emphasis on evidence gleaned from randomized clinical trials limits the value of so-called “evidence-based practices” for diverse cultural groups, making cultural competence and evidence-based practice incompatible (Aisenberg, 2008; Kirmayer, 2012). It has been noted that clinical trials have rarely explored the suitability of an intervention for non-dominant population groups, nor have culturally adapted versions of tested interventions been widely tested (Aisenberg, 2008; Echo-Hawk, 2011; Whaley & Davis, 2007; Whitley, 2007). Evidence-informed practitioners, with a broader embrace of other research methodologies and multiple sources of knowledge, have the potential to be far more culturally appropriate and can contribute to the development of a larger body of culturally pertinent knowledge. Within the model this will require significant appreciation and exploration of the complexity of the community and cultural environment in which the service user, practitioner, and organization are located.
Organization and Professional Context
Because most social work practice is delivered in the context of organizations, individual practitioners are not free agents. They are both supported and constrained by agency mission, mandates, policies, and resources. Therefore, evidence-informed practice in action is the work of individual practitioners interacting with client systems in the context of a social agency and a community. Individual practitioners seeking to engage a client system must consider what is possible within the framework of their organization. What information should and can be gathered? What practice approaches are allowable and available? How much contact time over how long a period is permissible? What assessment, supports, and ancillary services are available either within the organization or in its service network? Does the organization implement a fully or partially manualized intervention approach that is part of the framework for service delivery for all or part of the organization? Practitioners need to consider the feasibility of their plans within the organization’s working framework.
When clients’ needs are a good fit for a particular manualized intervention, the organization’s use of these evidence-supported intervention packages may help a practitioner in their partnership with those individual clients. Conversely, because clients’ needs are often diverse, some clients’ needs will not be well matched to the intervention package(s) being implemented. Intervention packages that require a worker to use specific techniques or steps may constrain the worker from using other approaches, preventing the implementation of evidence-informed practice optimally matched to the specific client’s needs with the greatest likelihood of success (Jenson & Howard, 2013).
The following section explores how each of the ingredients of evidence-informed practice is implemented within the steps of the evidence-informed practice process.
Steps in Evidence-Informed Practice
As has been shown, evidence-informed practice is not simply a research or research-consumption strategy but a practice model whereby evidence, which may or may not be in the form of empirical research, is appraised and integrated with theory, practice wisdom, client values and choices, and the cultural and contextual factors of the client, community, and agency to ensure that the best possible intervention is used with a client. In addition, evidence-informed practice acknowledges that the practice process is not linear and static but ongoing, with evidence reappraised and the intervention(s) adjusted as necessary throughout the treatment process.
Petch (2010) describes and diagrams the evidence-informed practice cycle as (1) determining the issue, (2) accessing and appraising the evidence, (3) identifying practice implications, (4) implementing, and (5) reviewing and formulating (p. 33).
The following sections review each step in the evidence-informed practice model.
Determining the Issue
Determining the issue is, importantly, a collaborative process among the social worker, the client, and the practice context. It is part of the assessment process to determine what presenting problem should be addressed and, therefore, what relevant information is required to determine the best course of treatment. The process of determining the issue begins prior to first contact with the client and continues on an ongoing basis. Consistent with the nonlinear nature of the evidence-informed practice model, determining the issue is not seen as definitive. It marks the beginning of the problem-definition process and, consistent with critical reflection, is revisited as necessary throughout treatment to ensure that it remains appropriate.
Accessing and Appraising Relevant Evidence
Once the presenting issue has been determined, the next stage is the acquisition and appraisal of relevant evidence. This stage involves first locating relevant evidence from a range of diverse sources, a task for which information literacy skills are critical. Potential sources of evidence include empirical studies published in academic journals (qualitative, quantitative, and mixed-methods), case studies, case notes published in professional journals, conference proceedings, evaluation reports, government research reports, and professional clearinghouse websites. Petch (2010) notes that not only is relevant evidence “the best available research evidence” but that the process involves “attention to user and carer views, experiences and opinions of the direct receipt of care and support services, which may have been formally collected through an evaluation study or may have been expressed more informally in the course of daily practice” (p. 34).
Once potential sources of evidence have been located, appraisal involves evaluating the validity and relevance of the information. For empirically based research studies, for example, the methodological rigor of the relevant research findings will be reviewed. The evidence must be examined to determine the extent to which the findings reflect sound research principles and are therefore valid. A particularly important step is careful examination of the target population and sample studied to assess the appropriateness of generalizing from the research to the current practice problem of concern. This process can be aided by asking some key questions: Was the methodology appropriate for the question and the state of the research knowledge to that point? How did the practitioner account for potential bias in the case review? And do these evaluation findings represent a population that has relevance to my client?
Particular attention should be paid to cultural and contextual factors to determine the extent to which the findings from any study with its sample are likely to be culturally and contextually relevant for work with the client or clients of concern. Through this appraisal process the practitioner determines what is the best available evidence relevant to this particular case in this particular context.
Identifying Practice Implications
If the evidence appears to be valid and of potential value, the next stage of the evidence-informed practice process involves compiling and synthesizing the practice-relevant evidence, identifying the practice implications, and determining possible courses of action that might be appropriate options for the particular client in the particular context. Practice wisdom also informs the decision-making process and provides information regarding the potential practice implications of these different treatment options. Included in this step are the service user’s contemplation of the care options and the collaborative choice of a course of action.
Implementing
During the implementation phase the social worker embarks on the course of action collaboratively chosen with the service user. Implementation may involve careful adherence to a manualized treatment, purposeful adaptation of a research-supported treatment, or a treatment that evolves out of practice wisdom and theory but has minimal related research evidence. During this phase the practitioner must remain mindful of the emergence of new information that might alter the chosen course of action.
Review and Formulating
The review and formulating phase is critical to the dynamic nature of the evidence-informed practice process. This phase begins as soon as the intervention phase begins. The appropriateness of the treatment and the progress made by the client are noted and evaluated, and adjustments are made as necessary with service user input. Consideration is given not only to whether the chosen intervention is alleviating symptoms but also to whether it appears to be a “good fit” for the client and is appropriate given cultural and contextual factors.
The five steps outlined describe the evidence-informed practice process and reflect the evidence-informed practice principles articulated earlier. However, there are potential challenges to the implementation of evidence-informed practice, which need to be considered when adopting it as a model within agency practice.
Barriers to Implementation
Many authors have noted that practice in the real world is “messy” (e.g., Gitterman & Knight, 2013; Pollio, 2006). Clients and client systems’ lives are complex and dynamic. Empirically tested interventions or treatments may not work as expected in this “messy” context. This may be especially true for evidence developed in highly controlled randomized clinical trials of efficacy (Gilgun, 2005; Mullen, 2014). Any process of utilizing previously tested approaches carries two risks: that the approach will not fit the identified client, and that the worker will become mechanical in the interaction with the client.
The barriers to implementing evidence-informed practice (or any other evidence-focused model) in agency-based or independent practice are formidable. One might argue that the difficulties in implementing evidence-informed practice, with its broader mandate to integrate practice wisdom, theory with critical reflection, and common features of practice, are greater than for the somewhat simpler evidence-based practice. It may be particularly challenging for new practitioners to gain sufficient practice wisdom to be able to draw strategically on existing interventions or courses of action that will be helpful and will not detract from the vitality and authenticity of the therapeutic alliance with clients.
Further, the call for evidence-based practice has surfaced a variety of concerns that are pertinent to evidence-informed practice as well: (1) insufficient evidence on which to base practice, (2) lack of access to evidence, (3) a stepwise process so burdensome that practitioners simply do not have the time, (4) the need for context-sensitive translation of effective practices, since not all practices can be extracted from their context or planted in a new one, and (5) lack of time and resources. The challenges of implementing evidence-informed practice will differ for practitioners based in organizations and those in private practice in the community. Agency-based practitioners may have more assistive resources, while those in private practice may find they have significant flexibility of action and implementation. Both may have little time for the process of locating, accessing, and appraising relevant evidence.
Promoting Evidence-Informed Practice
Social work researchers, practitioners, and policy makers in many countries have grappled with how to promote implementation of evidence-informed practice, evidence-based practice, and practice-based research. A number of efforts are being undertaken to promote evidence-informed practice in the United States and other countries, and the resources generated are serving to improve the evidence base available for the profession.
Efforts to integrate and optimize evidence-informed practice are complicated by the fact that researchers, practitioners, and policy makers operate in three distinct sectors, each with different cultures that need to be recognized and bridged through increased communication (Arney, Bromfield, Lewig, & Holzer, 2009). Petch (2009) recommends that a broker be used to help bridge this gap and facilitate conversations between practice and policy.
Recognizing this need for active interaction between sectors, collaborations, networks, and partnership centers and resources for accessing and assessing practice evidence have developed in many countries. In the United Kingdom, Research in Practice has local government agencies, national organizations, and international partners and offers a combination of “interconnected work strands including Change Projects, Learning Events, Networking, Publications, Website and Working across Boundaries” (Eccles, 2009, p. 194). Dill and Shera (2009) detail PART (Practice and Research Together), based in Ontario, Canada. Other examples include the Norwegian Knowledge Centre for the Health Services and the Joanna Briggs Institute in Australia. Further, Arney, Bromfield, Lewig, and Holzer (2009) analyzed the bridging strategies employed by five different Australian organizations and note that these entities typically provide a website offering free access to some (sometimes all) publications; reviews of evidence, briefing papers, or practitioner-focused summaries; and implications for practice. PART, for example, distributes what it calls “PARTicles”: “concise literature reviews that provide practitioners with important research findings that can be linked to case decisions” (Dill & Shera, 2009, p. 160). In the United States, government agencies have established registries of evidence-based practices, such as NREPP, the National Registry of Evidence-based Programs and Practices maintained by SAMHSA (the Substance Abuse and Mental Health Services Administration), which has information on over 350 interventions. In addition to registries, SAMHSA, like many of the collaboratives, offers toolkits for using evidence in specific areas of practice as well as stepwise approaches to implementing evidence-informed practice; see, for example, the Institute for Research and Innovation in Social Services’ Confidence through Evidence toolkit and SAMHSA’s Evidence-Based Practices KITS.
The proliferation of these resources means that simple Internet access can provide nearly anyone with current information for a wide variety of practice issues.
Promoting EIP in Organizations
In a time of high demand on practitioners, with large caseloads and extensive reporting requirements, successfully encouraging an evidence-informed practice approach requires an organization to make an affirmative commitment to creating a climate in which reflective evidence-informed practice thrives (Arney, Bromfield, Lewig, & Holzer, 2009; Austin, Dal Santo, & Lee, 2012; Petch, 2009; Petch, 2010) and to support the blossoming of research-minded practitioners who are likely to be adopters of evidence-informed practice (Austin et al., 2012). Austin (2012) notes that “one of the biggest challenges in building knowledge sharing systems is finding the time and resources to foster the interest and commitment required of line workers and practitioners to engage in evidence-informed practice with clients” (p. 2).
We know from a variety of surveys of practitioners and agency heads that despite valuing evidence-supported interventions, practitioners’ use of research findings or evidence-based practice specifically has been disappointing (Rosen, 2003). A survey of field instructors found that while nearly all agreed or strongly agreed that evidence-based practices were a useful idea, slightly less than 50% indicated that relevant research findings guided their selection of interventions (Edmond, Rochman, Megivern, Howard, & Williams, 2006).
Given the reality that, although there is support for evidence-supported practice, other considerations dominate practice decisions, work by scholars including Austin, McBeath, Petch, and others (Austin, 2008, 2012; Austin et al., 2012; McBeath & Austin, 2015; Petch, 2009, 2010) has identified actions organizations can take to promote evidence-informed practice. First, active strategic leadership with a vision and commitment to evidence-informed practice is essential. Leaders will need to reinforce the expectation that policy and practice will be evidence-informed and ensure that all organization stakeholders have a common understanding of what evidence-informed policy and practice mean in their organization. This can be aided by incorporating a specific commitment into the organization’s mission and goal statement as well as into procedures manuals, training, and supervisory practice.
Second, organizations can maximize support for staff to learn from evidence and share information through the inclusion of evidence-informed practice elements in pre-service and in-service training and in staff meetings. This can be done through discussion of the evidence foundation for interventions within supervision as well. The creation of and participation in online communities of practice, in which staff (possibly from many agencies) are encouraged to share questions and practice problems and to learn of pertinent evidence and solutions to practice challenges, is a promising approach. Information sharing can be promoted by using agency intranets, bulletins, and newsletters to identify and disseminate examples of evidence-informed practice pertinent to the work of organization staff. A cadre of evidence-informed champions can be developed to pilot test empirically supported interventions that seem promising for the agency’s clients and to share the results with colleagues.
Third, agencies can maximize access to research and provide time for practitioners to explore potential evidence. Essential to this process is ensuring that workers have access to computers and the Internet and sufficient information literacy skills to conduct basic and advanced online database searches. Refreshing or reinforcing research comprehension skills may also be necessary. Though these skills should have been acquired in the course of undergraduate BSW or graduate MSW education, it may be necessary to provide search skills and additional methodological training as part of pre- or in-service workshops. If providing time for all social workers is impractical, agencies may consider creating a staff position or contract for the services of an information specialist or information ombudsman to assist with database and Web searches needed by agency personnel on demand.
Organizations may also form strategic alliances with local colleges, universities, or faculty consultants to facilitate access to university resources or identify an individual or group of experts who will serve as “knowledge brokers” (Dill & Shera, 2009, p. 158) who could assist staff in acquiring and appraising empirical and other evidence.
Agencies can also take advantage of free online resources that are especially relevant for the implementation of evidence-informed practice, including the Campbell Collaboration and the Cochrane Collaboration, both of which provide systematic evidence reviews summarizing key research findings by topic area. Additionally, in the United States and the United Kingdom, a variety of organizations facilitate access to evidence, for example, the Social Work Policy Institute; Research in Practice; Research in Practice for Adults; and SAMHSA’s NREPP, the National Registry of Evidence-based Programs and Practices. In addition, the Institute for Research and Innovation in Social Services’ Confidence through Evidence provides a self-instructional toolkit for evidence-informed practice using a four-step model (Acquire, Assess, Adapt, Apply) and also offers other free resources.
Evidence-Informed Practice in Social Work Education
The challenge of preparing students to implement the evidence-informed practice model is formidable. The knowledge and skill repertoire required for effective evidence-informed practice is both extensive and varied, from critical reflection to appraisal of the applicability of empirical research. Some facets of evidence-informed practice have long been components of social work education, for example, theory, critical reflection, attunement, and attention to context and culture. Other facets may require revisiting, for example, the role of the relationship between practitioner and empowered service user. Successfully preparing students for evidence-informed practice will depend on infusing aspects of evidence-informed practice in both practice and theory courses; helping students to integrate practice and research learning in courses as well as in their field practicum, while being mindful of the staging of learning across both the foundation and advanced years; and optimizing the field practicum experience so that students are able both to gain the tacit knowledge and practice wisdom critical in field learning and to employ other aspects of the evidence-informed practice approach in their learning and work with clients.
According to the Council on Social Work Education (CSWE), field instruction is the “signature pedagogy” of the profession (CSWE, 2008, p. 8). Field instruction typically uses an apprenticeship model, with students watching and working under the guidance of a field instructor in an effort to build practice wisdom (tacit knowledge and critical reflection) and to integrate the theoretical and empirical material learned in classes with their real-world practice experience. The learning in such apprenticeships is dependent on the resources of both the setting and the field instructor. It is likely that students consolidate their practice wisdom through the field practicum but are less likely to have extensive practice in implementing evidence-supported practices. The use of evidence-supported intervention or evidence-based practice (EBP) approaches is viewed favorably in the field, but the actual utilization of such approaches is notably lower than this expressed approval and interest would suggest (Edmond, Rochman, Megivern, Howard, & Williams, 2006; Parrish & Rubin, 2012). This makes it likely that though students will have direct opportunity to develop tacit knowledge, critical reflection, and attunement, they may need further opportunities for learning the application of empirically supported interventions and approaches.
The limited success that EBP advocates have had in transforming the social work curriculum is a cautionary tale for advocates of evidence-informed practice. Despite wide agreement among proponents that mastery of this approach required broad-based infusion and reinforcement in a variety of courses, and notable efforts in some schools (e.g., Howard, McMillen, & Pollio, 2003) to promote evidence-based practice, instruction on evidence-based practice resides largely in research courses (Traube, Pohle, & Barley, 2012).
EBP advocates have focused on educating social work students to become effective consumers of research knowledge and have recommended changes in the substance and focus of required research courses (Howard, Allen-Meares, & Ruffolo, 2007; Yankeelov, Sar, & Antle, 2010). There is a proliferation of research texts that focus on research for evidence-based practice, as well as model syllabi posted by the Council on Social Work Education for a variety of “EBP and EST” courses, nearly all of which are research courses. While such developments may be useful to a curriculum that supports evidence-informed practice, some adjustment and expansion of content that allows for a more pluralistic approach to methodology and evidence may be needed.
Proponents of evidence-based practice have said that “the education of social workers will need to be strategically changed so that all aspects of coursework, field practicum and professional development include training in the steps of EBP” (Edmond et al., 2006, p. 380). A similar infusion and integration of evidence-informed practice learning will be important, particularly because of the complexity of integrating the many elements of evidence-informed practice. On the other hand, evidence-informed practice is highly compatible with more traditional views of the art of social work practice as well as with the aims of those who seek greater emphasis on the science of practice. Perhaps its place as a synthesis of these streams will facilitate the development of a supporting curriculum.
Evidence-informed practice emerged as one response to the call for the integration of evidence into social work practice decision-making. The model holds promise for uniting the knowledge base and skills of social work practice with the ever-growing body of empirical evidence, while simultaneously honoring the centrality of client preference and voice.
References
Aisenberg, E. (2008). Evidence-based practice in mental health care to ethnic minority communities: Has its practice fallen short of its evidence? Social Work, 53, 297–306.
Arnd-Caddigan, M. (2011). Toward a broader definition of evidence-informed practice: Intersubjective evidence. Families in Society: The Journal of Contemporary Social Services, 92, 372–376.
Arney, F. M., Bromfield, L. M., Lewig, K., & Holzer, P. (2009). Integrating strategies for delivering evidence-informed practice. Evidence & Policy: A Journal of Research, Debate & Practice, 5, 179–191.
Austin, M. J. (2008). Strategies for transforming human service organizations into learning organizations: Knowledge management and the transfer of learning. Journal of Evidence-Based Social Work, 5, 569–596.
Austin, M. J. (2012). Introduction. Journal of Evidence-Based Social Work, 9, 1–2.
Austin, M. J., Dal Santo, T. S., & Lee, C. (2012). Building organizational supports for research-minded practitioners. Journal of Evidence-Based Social Work, 9, 174–211.
Bertram, R. M., Charnin, L. A., Kerns, S. E. U., & Long, A. C. J. (2015). Evidence-based practices in North American MSW curricula. Research on Social Work Practice, 25(6), 737–748.
Betts Adams, K., Matto, H. C., & LeCroy, C. W. (2009). Limitations of evidence-based practice for social work education: Unpacking the complexity. Journal of Social Work Education, 45, 165–186.
Callister, L., & Edwards, J. (2010). Achieving millennium development goal 5, the improvement of maternal health. Journal of Obstetric, Gynecological, and Neonatal Nursing, 39, 590–599.
Chalmers, I. (2005). If evidence-informed policy works in practice, does it matter if it doesn’t work in theory? Evidence and Policy: A Journal of Research, Debate and Practice, 1, 227–242.
Council on Social Work Education. (2008). Educational policy and accreditation standards.
Council on Social Work Education. (2015). Educational policy and accreditation standards.
Council on Social Work Education. (n.d.). Model EBP and EST syllabi.
D’Cruz, H., Gillingham, O., & Melendez, S. (2007). Reflexivity, its meanings and relevance for social work: A critical review of the literature. British Journal of Social Work, 37, 73–90.
Dallos, R., & Stedmon, J. (2009). Flying over the swampy lowlands: Reflective and reflexive practice. In J. Stedmon & R. Dallos (Eds.), Reflective practice in psychotherapy and counseling (pp. 1–22). Maidenhead, UK: McGraw Hill Open University Press.
Dill, K., & Shera, W. (2009). Designing for success: The development of a child welfare research utilisation initiative. Evidence & Policy: A Journal of Research, Debate & Practice, 5, 155–166.
Dodd, S. J., & Epstein, I. (2012). Practice-based research in social work: A guide for reluctant researchers. New York: Routledge.
Dodd, S. J., & Savage, A. (2014, October). Embedding evidence informed practice into social work practice: Curricula and agency challenges. Poster presented at the Council on Social Work Education Annual Program Meeting, Tampa, FL.
Eccles, C. (2009). Effective delivery of evidence-informed practice. Evidence & Policy: A Journal of Research, Debate & Practice, 5, 193–206.
Echo-Hawk, H. (2011). Indigenous communities and evidence building. Journal of Psychoactive Drugs, 43, 269–275.
Edmond, T., Rochman, E., Megivern, D., Howard, M., & Williams, C. (2006). Integrating evidence-based practice and social work field education. Journal of Social Work Education, 42, 377–396.
Epstein, I. (2009). Promoting harmony where there is commonly conflict: Evidence-informed practice as an integrative strategy. Social Work in Health Care, 48, 216–231.
Fook, J., & Gardner, F. (Eds.). (2013). Critical reflection in context: Applications in health and social care. Abingdon, UK: Routledge.
Gambrill, E. (1999). Evidence-based practice: An alternative to authority-based practice. Families in Society: Journal of Contemporary Human Services, 80, 341–350.
Gambrill, E. (2001). Social work: An authority-based profession. Research on Social Work Practice, 11, 166–175.
Gambrill, E. (2003). Evidence-based practice: Sea change or the emperor’s new clothes? Journal of Social Work Education, 39, 3–23.
Gambrill, E. (2008). Evidence-informed practice. In W. Rowe, L. A. Rapp-Paglicci, K. M. Sowers, & C. N. Dulmus (Eds.), Handbook of social work and social welfare (pp. 3–28). Hoboken, NJ: John Wiley.
Gambrill, E. (2010). Evidence-informed practice: Antidote to propaganda in the helping professions? Research on Social Work Practice, 20, 302–320.
Gibbs, L. E. (2003). Evidence-based practice for the helping professions. New York: Wadsworth.
Gilgun, J. F. (2005). The four cornerstones of evidence-based practice in social work. Research on Social Work Practice, 15, 52–61.
Gitterman, A., & Knight, C. (2013). Evidence-guided practice: Integrating the science and art of social work. Families in Society: The Journal of Contemporary Social Services, 94, 70–78.
Gray, M., Plath, D., & Webb, S. A. (2009). Evidence-based social work: A critical stance. New York: Routledge.
Haight, W. L. (2010). The multiple roles of applied social science research in evidence-informed practice. Social Work, 55, 101–103.
Howard, M., Allen-Meares, P., & Ruffolo, M. C. (2007). Teaching evidence-based practice: Strategic and pedagogical recommendations for schools of social work. Research on Social Work Practice, 17, 561–568.
Howard, M. O., McMillen, C. J., & Pollio, D. E. (2003). Teaching evidence-based practice: Toward a new paradigm for social work education. Research on Social Work Practice, 13, 234–259.
Jenson, J. M., & Howard, M. O. (2013). Evidence-based practice. In C. Franklin (Ed.), Encyclopedia of social work.
Kirmayer, L. J. (2012). Cultural competence and evidence-based practice in mental health: Epistemic communities and the politics of pluralism. Social Science & Medicine, 75, 249–256.
McBeath, B., & Austin, M. J. (2015). The organizational context of research-minded practitioners: Challenges and opportunities. Research on Social Work Practice, 25(4), 446–459.
McEwen, J., Crawshaw, M., Liversedge, A., & Bradley, G. (2008). Promoting change through research and evidence-informed practice: A knowledge transfer partnership project between a university and a local authority. Evidence & Policy: A Journal of Research, Debate & Practice, 4, 391–403.
Mullen, E. J. (2014). Evidence-based knowledge in the context of social practice. Scandinavian Journal of Public Health, 42, 59–73.
Nevo, I., & Slonim-Nevo, V. (2011). The myth of evidence-based practice: Towards evidence-informed practice. British Journal of Social Work, 41, 1176–1197.
Okpych, N. J., & Yu, J. L.-H. (2014). A historical analysis of evidence-based practice in social work: The unfinished journey toward an empirically grounded profession. Social Service Review, 88, 3–58.
Parrish, D. E., & Rubin, A. (2012). Social workers’ orientations toward the evidence-based practice process: A comparison with psychologists and licensed marriage and family therapists. Social Work, 57, 201–210.
Petch, A. (2009). Guest editorial. Evidence & Policy: A Journal of Research, Debate & Practice, 5, 117–126.
Petch, A. (2010). Swings and roundabouts: From evaluation to evidence-informed practice in social services. Revista De Asistenta Sociala (Social Work Review), 9, 29–39.
Pollio, D. (2006). The art of evidence-based practice. Research on Social Work Practice, 16, 224–232.
Proctor, E. (2007). Implementing evidence-based practice in social work education: Principles, strategies, and partnerships. Research on Social Work Practice, 17, 583–591.
Regehr, C., Stern, S., & Shlonsky, A. (2007). Operationalizing evidence-based practice: The development of an institute for evidence-based social work. Research on Social Work Practice, 17, 408–416.
Reid, W. (1994). The empirical practice movement. Social Service Review, 68, 165–184.
Roberts, A. R., & Yeager, K. R. (Eds.). (2004). Evidence based practice manual: Research and outcome measures in health and human services. New York: Oxford University Press.
Rosen, A. (2003). Evidence-based social work practice: Challenges and promise. Social Work Research, 27, 197–208.
Sackett, D. L., Rosenberg, W. M. C., Gray, J. A., Haynes, R. B., & Richardson, W. S. (1996). Evidence-based medicine: What it is and what it isn’t. British Medical Journal, 312, 71–72.
Schon, D. (1995). The reflective practitioner: How professionals think in action. Aldershot, UK: Arena.
Shlonsky, A., & Mildon, R. (2014). Methodological pluralism in the age of evidence-informed practice and policy. Scandinavian Journal of Public Health, 42(Supplement 13), 18–27.
Shlonsky, A., Noonan, E., Littell, J., & Montgomery, P. (2011). The role of systematic reviews and the Campbell Collaboration in the realization of evidence-informed practice. Clinical Social Work Journal, 39, 362–368.
Snelgrove-Clarke, E. E., & Rush, J. (2011). Maternal and women’s health: Evidence-informed practice on a global scale. Worldviews on Evidence-Based Nursing/Sigma Theta Tau International, Honor Society of Nursing, 8, 125–127.
Thyer, B., & Pignotti, M. (2011). Evidence based practices do not exist. Clinical Social Work Journal, 39, 328–333.
Traube, D. E., Pohle, C. E., & Barley, M. (2012). Teaching evidence-based social work in foundation practice courses: Learning from pedagogical choices of allied fields. Journal of Evidence-Based Social Work, 9, 241–259.
Whaley, A. L., & Davis, K. E. (2007). Cultural competence and evidence-based practice in mental health services. American Psychologist, 62, 563–574.
Whitley, R. (2007). Cultural competence, evidence-based medicine, and evidence-based practices. Psychiatric Services, 58, 1588–1590.
Yankeelov, P. A., Sar, B. K., & Antle, B. F. (2010). From “producing” to “consuming” research: Incorporating evidence-based practice into advanced research courses in a master of social work program. Journal of Teaching in Social Work, 30, 367–384.
Further Reading
Dill, K., & Shera, W. (2012). Implementing evidence-informed practice: International perspectives. Toronto: Canadian Scholars’ Press.
Per, C. G. (2009). Multidimensional evidence-based practice. New York: Routledge.
Roberts-DeGennaro, M., & Fogel, S. J. (2011). Using evidence to inform practice for community and organizational change. Chicago, IL: Lyceum Books.
Soydan, H., & Palinkas, L. A. (2014). Evidence-based practice in social work. New York: Routledge.