
Abstract: Field Instructors Extending EBP Learning in Dyads (FIELD) has been crafted in consideration of the social work profession’s need for innovative and collaborative models within field education that further evidence-based practice (EBP) implementation efforts. FIELD is driven by the continuing education interests of field instructors and the availability of local expertise, and it embraces the complementary strengths of students and field instructors. Herein, we provide the background for the development of such a curriculum model and delineate its components. FIELD may offer a viable curricular option for synchronizing academic and field efforts toward sustainable social work workforce improvements.

Introduction

Equipping current and future social work practitioners with skills to engage in and deliver evidence-based practice (EBP) can be a significant challenge. One underutilized mechanism for disseminating and implementing EBP in social work is field education: specifically, field instructors. Field instructors serve as primary mediators of student learning, with the ability to support, extend, or extinguish what students understand about EBP from the classroom. Given the social justice imperative of bringing evidence-based interventions to the vulnerable populations common to social work practice, we present a justification for the necessity of innovative curriculum models that consistently include field instructors. Herein, we present the rationale for the timeliness of such models, informed by obstacles, opportunities, and varied perspectives on EBP within the social work profession, many of which are tied to the belief that EBP refers exclusively to manualized interventions. To address this state of affairs as well as the confusion about the meaning of EBP, we were compelled to craft a curriculum model for the field instructor and social work student dyad. FIELD capitalizes on current impetuses for EBP while simultaneously educating students and field instructors on both aspects of EBP: a process of decision making (the verb of EBP) and designated, well-specified, empirically supported interventions (the noun of EBP). Within this paper, we offer a rationale for FIELD, explicate the model components, and present a preliminary qualitative assessment of the feasibility of the model.

Current State of EBP Implementation in Social Work

Workforce Lacks Capacity to Deliver EBP
In spite of the many EBP interventions available, many practitioners have not developed the capacity to implement innovative technologies and recent evidence-based practices (Fixsen, Naoom, Blasé, Friedman, & Wallace, 2005; Stirman et al., 2012). Training strategies and implementation efforts to advance the use of EBP in the social work workforce have trailed behind the creation of interventions (Beidas, Edmunds, Marcus, & Kendall, 2012; Lyon, Stirman, Kerns, & Bruns, 2011). Research has illuminated the societal costs associated with the inability to implement evidence-based practices in a timely fashion. For example, the astounding delay of an average of 17 or more years for EBPs to infiltrate practice settings has been well documented (Balas & Boren, 2000; Boren & Balas, 1999; Institute of Medicine, 2001; New Freedom Commission on Mental Health, 2003). Evidence related to the dissemination of EBPs in social work settings increasingly indicates that few educational programs succeed in equipping students with the necessary skills to effectively deliver EBP (Grimshaw et al., 2001; Hoge, Huey, & O’Connell, 2004), though a competency-based curriculum adopting evidence-based practice education methods may systematically improve the workforce (Davis, O’Brien, & Freemantle, 1999; Hoge et al., 2009; Mazmanian & Davis, 2002).

Developing proficiency in the delivery of EBPs requires integrated didactic and experiential training methods, yet well over half of U.S. schools of social work do not offer this approach (Hoge et al., 2009; Mullen, Shlonsky, Bledsoe, & Bellamy, 2005; Weissman et al., 2006). By design, the majority of social work programs offer incomplete training in EBP (Bledsoe et al., 2007), since they often “outsource” the experiential teaching component to community-based field instructors who may lack familiarity with EBP concepts (Mullen & Bacon, 2004).

Rationale for EBP Implementation

An Ethical Imperative for Social Work to Offer EBP
Social workers are the nation’s primary providers of mental health, substance abuse, and child welfare services to vulnerable populations (Heisler & Bagalman, 2014; Insel, 2004), and services guided by EBP are more likely to demonstrate improved client outcomes (APA/CAPP Task Force on Serious Mental Illness and Severe Emotional Disturbance, 2007; McHugo et al., 2007; Institute of Medicine, 2001), to meet the changing needs of clients, and to deliver culturally competent care (Whaley & Davis, 2007). Vulnerable populations are entitled to programs, treatments, and interventions deemed best practices (Weissman et al., 2006), and many in social work consider the capacity to capably deliver services supported by research an ethical imperative (Myers & Thyer, 1997; Rubin, 2014; Thyer, 2014).

EBP Potential to Alleviate Staff Turnover
Staff turnover has long been damaging to the social work workforce: annual rates often exceed 25 percent (Gallon, Gabriel, & Knudsen, 2003), and in child and adolescent services, rates of turnover can surpass 50 percent (Aarons & Sawitzky, 2006; Glisson, Dukes, & Green, 2006; Glisson & James, 2002). There is evidence that EBP implementation provides a protective factor against staff turnover (Aarons, Sommerfeld, Hecht, Silovsky, & Chaffin, 2009). However, high turnover rates themselves serve as an impediment to efforts at implementing EBPs (Resnick & Rosenheck, 2009; Woltmann et al., 2008). Despite the compelling evidence for advancing an EBP implementation agenda, other challenges interfere.

Unique Challenges to Social Workers’ Preparation to Deliver EBP

Multiple Definitions of EBP
There is a great deal of misunderstanding related to how to teach and discuss EBP (Rubin & Parrish, 2004; Mullen et al., 2005; Upshur & Tracy, 2004), as there are two ways in which EBP is understood. The first is a process of decision making (Mullen & Streiner, 2004; Gibbs & Gambrill, 2002; Sackett et al., 1996) that entails the search for and application of the best available evidence to practice delivery. In collaboration with clients, the utility of such evidence is continually evaluated. Several social work programs have adopted this process model to organize course curriculum (Edmond, Megivern, Williams, Rochman, & Howard, 2006; Howard, McMillen, & Pollio, 2003; Thyer, 2007). Within this context, EBP is being used as a verb. The second definition refers to the designation of a particular intervention for which improved outcomes for a population with specific diagnoses or conditions have empirical support. These designated EBPs have been referred to as evidence-supported treatments (ESTs) or empirically supported interventions (Weissman et al., 2006). Examples include Trauma-Focused Cognitive Behavioral Therapy (TF-CBT), Dialectical Behavior Therapy (DBT), Motivational Interviewing (MI), and Multisystemic Therapy (MST). EBP is used as a noun in this context.

EBP Ideological Differences in Social Work
Further complicating the picture, vested stakeholders hold diverse positions regarding the cultural impact of evidence-based practice on the profession, clients, and field settings. There are concerns about inadequate discussion and debate on the merits of EBP and worries that the therapeutic alliance may be hindered by the delivery of specified interventions (Adams, Matto, & LeCroy, 2009). Another apprehension is the belief that there is an over-reliance on research guided by positivism (Mullen & Streiner, 2004; Rubin, 2011; Staller, 2006). In fact, clinical judgment is a cornerstone of EBP (McNeill, 2006), as is the social worker’s unique and artful use of self in engaging clients, and client voices are central to the process of EBP decision making. These misconceptions nonetheless result in the continued dismissal of EBP as a process of decision making (the verb) and, therefore, undercut the possibility of implementing an empirically supported intervention (the noun). Both forms of EBP are designed to guard against flaws in decision making and the promotion of ineffective services (Gambrill, 2006; Rubin, 2011).

Although there is now a more conscientious effort to use EBP in diverse social work practice settings, including employment, child welfare, health, juvenile justice, mental health, and substance abuse (Fixsen, Blase, Naoom, & Wallace, 2009), some social workers actively resist the use of EBP (Gibbs, 2003; Nelson, Steele, & Mize, 2006). Moreover, even when required to do so, many do not incorporate research evidence into their practice (Bledsoe et al., 2007; Mullen, Bledsoe, & Bellamy, 2008). Furthermore, research on how empirically supported material is taught in psychotherapy training programs has focused on psychologists rather than social workers (Ravitz & Silver, 2004; Shernoff, Kratochwill, & Stoiber, 2003; Weissman et al., 2006).

An Opportune Time to Advance EBP Implementation in Social Work

Institutional Support from the Council on Social Work Education (CSWE)
Historically, social work curriculum design was approached only from a content perspective. Over the years, to meet CSWE standards, new content was incrementally incorporated, leaving less room for practical application. However, the recent Educational Policy and Accreditation Standards (EPAS) instituted by the CSWE (2008) no longer mandate academic content, but rather introduce the notion of requisite student competencies as the organizing principle for curriculum design (Holloway, Black, Hoffman, & Pierce, 2009). Competency building necessarily relies on field instructors, as they implement the practical elements of the curriculum.

Timing Is Everything
Current policy and advances in technology create a climate ripe for uptake of EBP in social work. Despite the challenges of integrating science and social work, managed care organizations are demanding that EBPs be employed for reimbursement. In addition, the Patient Protection and Affordable Care Act offers an historic opportunity to expand social work services for previously uninsured Americans (Alegria et al., 2012). There is also unprecedented access to electronic bibliographic databases (Howard et al., 2003; Soydan, 2007), systematic reviews, and meta-analyses (Littell, Corcoran, & Pillai, 2008). Finally, modeled after the Cochrane Library [Cochrane Database of Systematic Reviews (CDSR)], there are now clearinghouses focused specifically on evidence-based social work (for example, the California Evidence-Based Clearinghouse for Child Welfare) and related intervention outcomes (Soydan, Mullen, Alexandra, Rehnman, & Li, 2010), making this an even more propitious time to move forward on this EBP training agenda for social workers.

Field Instructors: Lynchpins to Training Social Workers to Deliver EBP

Field education is understudied (Kurzman, 2011; Lager & Robbins, 2004) despite research suggesting that the field instructor is far more influential in students’ learning and their development of professional social work identities than the classroom instructor (Bogo et al., 2004, p. 417; Lager & Robbins, 2004). Consistent calls to examine the integration of field practice and academic curriculum (Council on Social Work Education, 2008; Miller, 2013; Tuchman & Lalane, 2011) have been made over decades (Carey & McCardle, 2011). Suggestions for enhancing classroom/field integration have included training field instructors to transition from practitioner to teacher (Knight, 2000) and empirically examining the impact of training to establish best practices (Herschell, Kolko, Baumann, & Davis, 2010). However, the pleas are for bold systemic innovations rather than piecemeal approaches (Wayne, Bogo, & Raskin, 2010) toward developing highly qualified social work practitioners.

Wayne, Bogo, and Raskin (2010) discuss the irony of the centrality of field education in social work given that the EPAS, outlined by the CSWE, offer no pedagogical principles for field instruction beyond providing 900 hours of field education for master’s students. There are no recommendations related to supervisory structure, format, or learning/teaching processes. Other than requiring a social work degree and two years of post-MSW experience, there are great variations in field instructor characteristics from setting to setting and program to program (Wayne, Bogo, & Raskin, 2010). This diversity holds great appeal but produces inconsistency in student assignments, as some students shadow and observe seasoned practitioners while others are sent into solo practice experiences from the start (Fortune & Kaye, 2002; Homonoff, 2008; Mumm, 2006; Wayne, Bogo, & Raskin, 2010). Administrators, educators, and practitioners alike are indebted to field instructors for their investments of time and talent in social work education. However, Wayne, Bogo, and Raskin (2010) rightly express concern about the variability of field instruction experiences.

Special attention needs to be paid to the field instructor/social work supervisee relationship to better bridge the integration of EBP into social work training programs (Tebes et al., 2010). Bledsoe et al. (2007) suggest that further training of field instructors could address the persistent concern that less than 40 percent of graduate programs in social work provide training in EBPs. An investigation aimed at increasing knowledge and skills related to EBP among New York’s mental health human services workforce found that advanced-practice social work students were valued for their knowledge about EBP, yet as interns with no status or authority, they could not be the drivers of EBP implementation in agency settings (Stanhope, Tuchman, & Sinclair, 2011).

A Need for Innovative Educational Models Involving Field Instructors
Maximizing the effect of social work education in EBP requires aligning field instruction and classroom education to better prepare students for the increasingly challenging contexts of agency practice (Mirabito, 2012). Field instructors appear to be willing promoters of EBP implementation: in one survey of 230 field instructors, 87 percent believed EBP would be useful in practice (Howard et al., 2003). New models of EBP training need to convey both ways in which EBP is understood, represent more than single-exposure training (Fixsen et al., 2005), and include experiential active learning components, such as behavioral role plays (Beidas & Kendall, 2010; Beidas et al., 2012), that result in improved adherence, competence, and skills. Models must also be highly sensitive to issues of feasibility for successful inclusion of field instructors. In particular, time (Edmond et al., 2006) poses the primary obstacle to field instructor involvement. Supervising students is daunting in the current climate, and Reisch and Jarman-Rohde (2000) have asserted that social work students increasingly must arrive at their internships prepared to learn more independently.

Teaching field instructor/student dyads the general concepts of the process of EBP as well as the specific introductory skills of a designated EBP may provide a platform for the field setting to develop greater capacity to reinforce classroom learning of EBPs. Linking the field instructor/social work intern relationship to EBP dissemination and implementation is possible. However, fundamental to this opportunity is the careful choice of method used to teach EBP process while selecting a designated EBP to introduce as an exemplar to the current generation of social work field instructors in tandem with the new generation of social workers. Arrangements for training will be unique to schools, as resources and schedules may vary and as some schools have taken other measures to translate EBP into field practice settings.

Rationale for and Strength of Dyad Design
Woody, Weisz, and McLean (2005) urge us to remember that field instructors are crucial to involve in curriculum development and training in EBP, and Mirabito (2012) suggests that we need more collaborative partnerships between faculty and field educators to maximize the integrity of the curricula and better prepare students for the increasingly challenging context of agency-based practice. Therefore, for the adoption of EBP to truly take place and gain a foothold in social work education, we posit that student/field instructor dyads must be trained together. Further, Howard et al. (2007) urge schools of social work to train field instructors in EBP methods and to facilitate their access to electronic databases to enhance the potential for practice informed by research. Often, students have access to these databases while their field instructors do not; nor do field instructors generally have the time and technical support to conduct these searches (see Table 1). The dyad design accounts for these issues of access while making use of students’ technological sophistication. With students conducting the literature searches, field instructors have the opportunity for exposure to current EBP research. Moreover, field instructors utilize their clinical expertise and experience for interpreting and assessing the available research evidence.

Concurrent Teaching of EBP Process and Designated EBPs in Dyad Approach
When conducting trainings on EBP, a complementary strategy that trains in both the process of EBP (verb) and the concepts and skills of a designated EBP (noun) would seem to be most effective (Bellamy, Bledsoe, Mullen, Fang, & Manuel, 2008). This approach can provide clarity regarding the two ways of defining EBP and ultimately reinforce the active use of electronic literature searches to keep current with research that may be helpful in serving the client population most effectively. Training dyads of field instructors and their social work student interns has the potential to overcome the limitations of past, uncoordinated efforts to disseminate and implement EBP in the social work profession.

Introducing the FIELD Model

At a large MSW program on the East Coast, we developed the FIELD model to employ this complementary strategy of training in both the process of EBP (verb) and the concepts and skills of a designated EBP (noun), providing clarity regarding the two ways EBP is understood. In choosing a designated EBP to incorporate into FIELD, we elected to offer didactic and experiential training on an EBP in which field instructors expressed interest. Engagement in FIELD is optional, driven by the continuing education desires of field instructors and the curriculum content and learning interests of students. Both field instructors and students at our school expressed interest in learning more about Motivational Interviewing (MI). Given that MI met the model’s selection criteria for an EBP and that this interest incentivized participation, we selected MI as our exemplar EBP, described in more detail later. Additionally, since field instructors typically need to obtain continuing education credits, we offered free continuing education credits as an incentive for participating in FIELD.

We explicate FIELD model elements in three tables herein and emphasize that adaptation is unique to each social work school, as resources and previous measures taken to translate EBP into field practice settings vary. FIELD embraces the complementary strengths that social work students and their field instructors bring to bear, rather than viewing these attributes as limitations (see Table 1). Students are typically more sophisticated and experienced in searching the literature, given the access provided by their educational institutions and classroom expectations to do so; further, students born in the 1980s or later are more likely to be digital natives. Likewise, field instructors, as practice experts, have developed more artful means with which to engage communities and client populations and can devote their limited time to assessing and applying the evidence to the “real world” of the practice setting. FIELD has field instructors relying on students to “bring the research” by searching the literature on the selected EBP as applied to their client population/agency setting, while field instructors interpret the research evidence and mentor and guide students on delivering the evidence, thus extending classroom education into practice contexts.

Curriculum design features (see Table 2) for FIELD include general model elements, field instructor model elements, and student model elements. These component parts can serve as a guide for schools to design adapted variants of FIELD unique to the school and community needs.

Criteria for Selection of EBP for FIELD
The selection of an EBP for FIELD is critical. Specific criteria used for choosing an EBP (see Table 3) pertain to field instructors’ interests as consumers of continuing education, as well as interest expressed by the student community. Other criteria relate to the compatibility of the EBP with social work values and the availability of expertise for ongoing consultation. Furthermore, the EBP selected must be applicable to the diverse settings where social workers practice and be relatively uncomplicated to teach and learn in a short time frame, as research has demonstrated that complex EBPs are far less likely to be implemented in practice (Ager et al., 2011; Amodeo et al., 2011).

Exemplar EBP in FIELD Pilot
Motivational Interviewing (MI) is an EBP applicable in a variety of social work settings where mental health, substance abuse, and child welfare services are provided. MI is increasingly taught in social work programs and has been found effective with an array of populations social workers commonly encounter in practice. MI can be employed in conjunction with more complex EBPs and delivered within brief case management encounters.

MI was selected because the inherent spirit and techniques are consonant with social work values (Hohman, 2011), and there is a rich body of evidence to support effectiveness with a variety of vulnerable populations served by social workers (Lundahl, Kunz, Brownell, Tollefson, & Burke, 2010). MI is a client-centered approach that aims to facilitate exploration and resolution of ambivalence (Miller & Rollnick, 2004) related to change in health, mental health, substance use, and child welfare risk behaviors. MI is closely aligned with the harm reduction model (van Wormer, 2007). Originally created by William Miller, MI has strong intuitive appeal, and its basic formulation parallels the strengths perspective of social work practice (van Wormer & Davis, 2002).

Feasibility of FIELD
We conducted a preliminary implementation of the FIELD model with 20 student–field instructor dyads participating. Participants were enthusiastic about learning MI techniques but less keen to learn about the process of EBP. Nevertheless, participants generally appreciated coaching on the steps of the EBP process, during which the literature was searched for applications of MI to their own field settings. Data from the qualitative responses indicated that all participants believed it was important to use research to inform practice. One field instructor working with foster care youth was helped by her student to locate a study that described the efficacy of a group-delivered MI intervention. She had struggled with birth parents’ substance use in her agency setting and was highly impressed to discover the versatility of MI. Some field instructors who wanted to learn about MI were appreciative that their students would be a part of FIELD. Several expressed that students “could really use it,” implying that FIELD might reinforce their supervisory efforts. Finally, regarding the booster session, which included expert clinical feedback on participants’ fidelity to MI from practice audiotapes along with a written “check-in” about the steps of the EBP process, many field instructors and students alike reported that this was the first time they were directly evaluated and given feedback on a practice technique, which they found immensely helpful.

Conclusion

Schools of social work have an important leadership role when it comes to EBP implementation within the social work workforce. This effort has proved more difficult than researchers previously estimated (Grady, 2010), and new models must reflect the complexity of social work practice while capitalizing on natural partnerships inherent in professional education (Proctor, 2007). One novel approach to this challenge is the FIELD model. FIELD capitalizes on the rich existing resources provided by the variety of field instructors in social work education and furthers CSWE goals of promoting competency-based education in manageable and sustainable ways. Attending to the current generation of social workers by making strategic educational investments that include field instructors has rich potential for ongoing service improvements in the social work workforce.


References

Aarons, G. A., & Sawitzky, A. C. (2006). Organizational culture and climate and mental health provider attitude toward evidence-based practices. Psychological Services, 3, 61–72.

Aarons, G., Sommerfeld, D. H., Hecht, D. B., Silovsky, J. F., & Chaffin, M. J. (2009).  The impact of evidence-based practice implementation and fidelity monitoring on staff turnover: Evidence for a protective effect. Journal of Consulting and Clinical Psychology, 77(2), 270–280.

Adams, K. B., Matto, H. C., & LeCroy, C. W. (2009). Limitations of evidence-based practice for social work education: Unpacking the complexity. Journal of Social Work Education, 45(2), 165–186. doi:10.5175/JSWE.2009.200700105

Ager, R., Roahen-Harrison, S., Toriello, P. J., Kissinger, P., Morse, P., Morse, E., & Rice, J. (2011). Predictors of adopting motivational enhancement therapy. Research on Social Work Practice, 21(1), 65–76. doi:10.1177/1049731509353170

Alegria, M., Lin, J., Chen, C. H., Duan, N., Cook, B., & Meng, X. L. (2012). The impact of insurance coverage in diminishing racial and ethnic disparities in behavioral health services. Health Services Research, 47(3), 1322–1344. doi:10.1111/j.1475-6773.2012.01403.x

Amodeo, M., Lundgren, L., Cohen, A., Rose, D., Chassler, D., Beltrame, C., & D’Ippolito, M. (2011). Barriers to implementing evidence-based practices in addiction treatment programs: Comparing staff reports on motivational interviewing, adolescent community reinforcement approach, assertive community treatment, and cognitive-behavioral therapy. Evaluation and Program Planning, 34(4), 382–389. doi:10.1016/j.evalprogplan.2011.02.005

APA/CAPP Task Force on Serious Mental Illness and Severe Emotional Disturbance. (2007). Catalogue of clinical training opportunities: Best practices for recovery and improved outcomes for people with serious mental illness. Retrieved from http://www.apa.org/practice/resources/grid/catalog.pdf

Balas, E. A., & Boren, S. A. (2000). Managing clinical knowledge for healthcare improvements in a timely fashion. In Yearbook of medical informatics (pp. 65–70). Stuttgart, Germany: Schattauer.

Beidas, R. S., Edmunds, J. M., Marcus, S. C., & Kendall, P. C. (2012). Training and consultation to promote implementation of an empirically supported treatment: A randomized trial. Psychiatric Services, 63(7), 660–665.

Beidas, R. S., & Kendall, P. C. (2010). Training therapists in evidence-based practice: a critical review of studies from a systems-contextual perspective. Clinical Psychology: Science and Practice, 17, 1–30.

Bellamy, J. L., Bledsoe, S. E., Mullen, E. J., Fang, L., & Manuel, J. I. (2008). Agency–university partnership for evidence-based practice in social work. Journal of Social Work Education, 44(3), 55–76.

Bellamy, J. L., Bledsoe, S. E., & Traube, D. E. (2006). The current state of evidence-based practice in social work. Journal of Evidence-Based Social Work, 3(1), 23–48.

Bledsoe, S. E., Weissman, M. M., Mullen, E. J., Ponniah, K., Gameroff, M. J., Verdeli, H., & Wickramaratne, P. (2007). Empirically supported psychotherapy in social work training programs: Does the definition of evidence matter? Research on Social Work Practice, 17(4), 449–455. doi:10.1177/1049731506299014

Bogo, M., Regehr, C., Power, R., Hughes, J., Woodford, M., & Regehr, G. (2004). Toward new approaches for evaluating student field performance: Tapping the implicit criteria used by experienced field instructors. Journal of Social Work Education, 40(3), 417–426. doi:10.1080/10437797.2004.10672297

Boren, S. A., & Balas, E. A. (1999). Evidence-based quality measurement. Journal of Ambulatory Care Management, 22(3), 17–23.

Carey, M. E., & McCardle, M. (2011). Can an observational field model enhance critical thinking and generalist practice skills? Journal of Social Work Education, 47(2), 357–366. doi:10.5175/JSWE.2011.200900117

Council on Social Work Education. (2008). Educational policy and accreditation standards. Alexandria, VA: Council on Social Work Education.

Davis, D., O’Brien, M. A. T., Freemantle, N., Wolf, F. M., Mazmanian, P., & Taylor-Vaisey, A. (1999). Impact of formal continuing medical education: Do conferences, workshops, rounds, and other traditional continuing education activities change physician behavior or health care outcomes? Journal of the American Medical Association, 282(9), 867–874.

Edmond, T., Megivern, D., Williams, C., Rochman, E., & Howard, M. (2006). Integrating evidence-based practice and social work field education. Journal of Social Work Education, 42(2), 377–396. doi:10.5175/JSWE.2006.200404115

Fixsen, D. L., Blasé, K. A., Naoom, S. F., & Wallace, F. (2009). Core implementation components. Research on Social Work Practice, 19(5), 31–40. doi:10.1177/1049731509335549

Fixsen, D. L., Naoom, S. F., Blasé, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network.

Fortune, A., & Kaye, L. (2002). Learning opportunities in field practice: Identifying skills and activities associated with MSW students’ self-evaluation of performance and satisfaction. Clinical Supervisor, 21(1), 5–28. doi:10.1300/J001v21n01_02

Gallon, S. L., Gabriel, R. M., & Knudsen, J. R. W. (2003). The toughest job you’ll ever love: A Pacific Northwest treatment workforce survey. Journal of Substance Abuse Treatment, 24(3), 183–196.

Gambrill, E. (2006). Evidence-based practice and policy: Choices ahead. Research on Social Work Practice, 16(3), 338–357. doi:10.1177/1049731505284205

Gibbs, L. E. (2003). Evidence-based practice for the helping professions: A practical guide with integrated multimedia. Pacific Grove, CA: Thomson Brooks/Cole.

Gibbs, L., & Gambrill, E. (2002). Evidence-based practice: Counterarguments to objections. Research on Social Work Practice, 12(3), 452–476. doi:10.1177/1049731502012003007

Glisson, C., Dukes, D., & Green, P. (2006). The effects of the ARC organizational intervention on caseworker turnover, climate, and culture in children’s service systems. Child Abuse & Neglect, 30, 855–880.

Glisson, C., & James, L. R. (2002). The cross‐level effects of culture and climate in human service teams. Journal of Organizational Behavior, 23, 767–794. doi:10.1002/job.162

Grady, M. (2010). The missing link: The role of social work schools and evidence-based practice. Journal of Evidence-Based Social Work, 7, 400–411. doi:10.1080/15433711003591101

Grimshaw, J. M., Shirran, L., Thomas, R., Mowatt, G., Fraser, C., Bero, L., & O’Brien, M. A. (2001). Changing provider behavior: An overview of systematic reviews of interventions. Medical Care, 39(8, Suppl. 2), II2–II45.

Heisler, E. J., & Bagalman, E. (2014, January 7). The mental health workforce: A primer (Congressional Report No. R43255). Washington, DC: Library of Congress, Congressional Research Service. Retrieved from http://fas.org/sgp/crs/misc/R43255.pdf

Herschell, A. D., Kolko, D. J., Baumann, B. L., & Davis, A. C. (2010). The role of therapist training in the implementation of psychosocial treatments: A review and critique with recommendations. Clinical Psychology Review, 30(4), 448–466. doi:10.1016/j.cpr.2010.02.005

Hoge, M. A., Huey, L. Y., & O’Connell, M. J. (2004). Best practices in behavioral health workforce education and training. Administration and Policy in Mental Health and Mental Health Services Research, 32, 91–106.

Hoge, M. A., Morris, J. A., Stuart, G. W., Huey, L. Y., Bergeson, S., Flaherty, M. T., . . . Paris, M. (2009). A national action plan for workforce development in behavioral health. Psychiatric Services, 60(7), 883–887. doi:10.1176/appi.ps.60.7.883

Hohman, M. (2011). Motivational interviewing in social work practice. New York, NY: Guilford Press.

Holloway, S., Black, P., Hoffman, K., & Pierce, D. (2009). Some considerations of the import of the 2008 EPAS for curriculum design. Retrieved from http://www.education.uiowa.edu/docs/default-source/crue-publications/MayhewSeifertPascarella_Moral.pdf

Homonoff, E. (2008). The heart of social work: Best practitioners rise to challenges in field instruction. The Clinical Supervisor, 27(2), 135–169.

Howard, M. O., Allen-Meares, P., & Ruffolo, M. C. (2007). Teaching evidence-based practice: Strategic and pedagogical recommendations for schools of social work. Research on Social Work Practice, 17(5), 561–568. doi:10.1177/1049731507300191

Howard, M. O., McMillen, C. J., & Pollio, D. E. (2003). Teaching evidence-based practice: Toward a new paradigm for social work education. Research on Social Work Practice, 13(2), 234–259. doi:10.1177/1049731502250404

Insel, T. R. (2004, January 16). Science to service: Mental health care after the decade of the brain. Presented at the 8th Annual Meeting of the Society for Social Work and Research, New Orleans, LA.

Institute of Medicine Committee on Quality of Health Care in America. (2001). Crossing the quality chasm: A new health system for the 21st century. Washington, DC: National Academies Press.

Knight, C. (2000). Engaging the student in the field instruction relationship: BSW and MSW students’ views. Journal of Teaching in Social Work, 20(3-4), 173–201. doi:10.1300/J067v20n03_12

Kurzman, P. A. (2011). Preface to a special issue on field education. Journal of Teaching in Social Work, 31, 237–238. doi:10.1080/08841233.2011.580709

Lager, P. B., & Robbins, V. C. (2004). Field education: Exploring the future, expanding the vision. Journal of Social Work Education, 40(1), 3–11.

Littell, J. H., Corcoran, J., & Pillai, V. K. (2008). Systematic reviews and meta-analysis. New York, NY: Oxford University Press.

Lyon, A. R., Stirman, S. W., Kerns, S. E., & Bruns, E. J. (2011). Developing the mental health workforce: Review and application of training approaches from multiple disciplines. Administration and Policy in Mental Health and Mental Health Services Research, 38(4), 238–253. doi:10.1007/s10488-010-0331-y

Lundahl, B. W., Kunz, C., Brownell, C., Tollefson, D., & Burke, B. L. (2010). A meta-analysis of motivational interviewing: Twenty-five years of empirical studies. Research on Social Work Practice, 20(2), 137–160. doi:10.1177/1049731509347850

Mazmanian, P. E., & Davis, D. A. (2002). Continuing medical education and the physician as a learner: Guide to the evidence. Journal of the American Medical Association, 288(9), 1057–1060. doi:10.1001/jama.288.9.1057

McHugo, G. J., Drake, R. E., Whitley, R., Bond, G. R., Campbell, K., Rapp, C. A., Goldman, H. H., Lutz, W. J., & Finnerty, M. T. (2007). Fidelity outcomes in the national implementing evidence-based practices project. Psychiatric Services, 58(10), 1279–1284.

McNeill, T. (2006). Evidence-based practice in an age of relativism: Toward a model for practice. Social Work, 51(2), 147–156.

Miller, W. R., & Rollnick, S. (2004). Talking oneself into change: Motivational interviewing, stages of change and therapeutic process. Journal of Cognitive Psychotherapy: An International Quarterly, 18(4), 299–308.

Miller, S. E. (2013). Professional socialization: A bridge between the explicit and implicit curricula. Journal of Social Work Education, 49(3), 368–386. doi:10.1080/10437797.2013.796773

Mirabito, D. M. (2012). Educating a new generation of social workers: Challenges and skills needed for contemporary agency based practice. Clinical Social Work Journal, 40(2), 245–254. doi:10.1007/s10615-011-0378-6

Mullen, E. J., & Bacon, W. (2004). A survey of practitioner adoption and implementation of practice guidelines and evidence-based treatments. In A. R. Roberts & K. R. Yeager (Eds.), Evidence-based practice manual: Research and outcome measures in health and human services (pp. 210–218). New York, NY: Oxford University Press.

Mullen, E. J., Bledsoe, S. E., & Bellamy, J. L. (2008). Implementing evidence-based social work practice. Research on Social Work Practice, 9, 325–338. doi:10.1177/1049731506297827

Mullen, E. J., Shlonsky, A., Bledsoe, S. E., & Bellamy, J. L. (2005). From concept to implementation: Challenges facing evidence-based social work. Evidence & Policy: A Journal of Research, Debate and Practice, 1(1), 61–84.

Mullen, E. J., & Streiner, D. L. (2004). The evidence for and against evidence-based practice. Brief Treatment and Crisis Intervention, 4(2), 111–121. doi:10.1093/brief-treatment/mhh009

Mumm, A. M. (2006). Teaching social work students practice skills. Journal of Teaching in Social Work, 26(3), 71–89.

Myers, L. L., & Thyer, B. A. (1997). Should social work clients have the right to effective treatment? Social Work, 42(3), 288–298.

Nelson, T. D., Steele, R. G., & Mize, J. A. (2006). Practitioner attitudes toward evidence-based practice: Themes and challenges. Administration and Policy in Mental Health and Mental Health Services Research, 33, 398–409.

New Freedom Commission on Mental Health. (2003). Achieving the promise: Transforming mental health care in America (DHHS Publication No. SMA 03-3832). Rockville, MD: U.S. Department of Health and Human Services. Retrieved from http://www.mentalhealthcommission.gov/reports/FinalReport/downloads/downloads.html

Proctor, E. (2007). Implementing evidence-based practice in social work education: Principles, strategies and partnerships. Research on Social Work Practice, 17(5), 583–591. doi:10.1177/1049731507301523

Ravitz, P., & Silver, I. (2004). Advances in psychotherapy education. Canadian Journal of Psychiatry, 49(4), 230–237.

Reisch, M., & Jarman-Rohde, L. (2000). The future of social work in the United States: Implications for field education. Journal of Social Work Education, 36(2) 201–214. doi:10.1080/10437797.2000.10779002

Resnick, S. G., & Rosenheck, R. A. (2009). Scaling up the dissemination of evidence-based mental health practice to large systems and long-term time frames. Psychiatric Services, 60, 682–685. doi:10.1176/appi.ps.60.5.682

Rubin, A. (2011). Teaching EBP in social work: Retrospective and prospective. Journal of Social Work, 11(1), 64–79. doi:10.1177/1468017310381311

Rubin, A. (2014). A half-century of social work research: Advances and new challenges. Advances in Social Work, 15(1), 182–195.

Rubin, A., & Parrish, D. (2004). Challenges to the future of evidence-based practice in social work education.  Journal of Social Work Education, 33(3), 405–428. doi:10.5175/JSWE.2007.200600612

Sackett, D. L., Rosenberg, W., Gray, J., Haynes, R. B., & Richardson, W. S. (1996). Evidence based medicine: What it is and what it isn’t. British Medical Journal, 312(7023), 71. doi:10.1136/bmj.312.7023.71

Shernoff, E. S., Kratochwill, T. R., & Stoiber, K. C. (2003). Training in evidence-based interventions (EBIs): What are school psychology programs teaching? Journal of School Psychology, 41(6), 467–483.

Soydan, H. (2007). Improving the teaching of evidence-based practice: Challenges and priorities. Research on Social Work Practice, 17, 612–618. doi:10.1177/1049731507300144

Soydan, H., Mullen, E. J., Alexandra, L., Rehnman, J., & Li, Y. P. (2010). Evidence-based clearinghouses in social work. Research on Social Work Practice, 20(6), 690–700. doi:10.1177/1049731510367436

Staller, K. M. (2006). Railroads, runaways, & researchers. Qualitative Inquiry, 12(3), 503–522. doi:10.1177/1077800406286524

Stanhope, V., Tuchman, E., & Sinclair, W. (2011). The implementation of mental health evidence based practices from the educator, clinician and researcher perspective. Clinical Social Work Journal, 39(4), 369–378. doi:10.1007/s10615-010-0309-y

Stirman, S. W., Kimberly, J., Cook, N., Calloway, A., Castro, F., & Charns, M. (2012). The sustainability of new programs and innovations: A review of the empirical literature and recommendations for future research. Implementation Science, 7(12), 1–19. doi:10.1186/1748-5908-7-17

Stuart, G. W., Tondora, J., & Hoge, M. A. (2004). Evidence-based teaching practice: Implications for behavioral health. Administration and Policy in Mental Health and Mental Health Services Research, 32(2), 107–130.

Tebes, J. K., Matlin, S. L., Migdole, S. J., Farkas, M. S., Money, R. W., Shulman, L., & Hoge, M. A. (2010). Providing competency training to clinical supervisors through an interactional supervision approach. Research on Social Work Practice, 21(2), 190–199. doi:10.1177/1049731510385827

Thyer, B. A. (2014). Preparing current and future practitioners to integrate research in real practice settings. Research on Social Work Practice, doi:10.1177/1049731514538105

Thyer, B. A. (2007). Social work education and clinical learning: Towards evidence-based practice? Clinical Social Work Journal, 35(1), 25–32.

Tuchman, E., & Lalane, M. (2011). Evidence-based practice: Integrating classroom curriculum and field education. Journal of Teaching in Social Work, 31(3), 329–340. doi:10.1080/08841233.2011.580258

Upshur, R. E. G., & Tracy, C. S. (2004). Legitimacy, authority, and hierarchy: Critical challenges for evidence-based medicine. Brief Treatment and Crisis Intervention, 4(3), 197–204.

Van Wormer, K. (2007). Principles of motivational interviewing geared to stages of change: A pedagogical challenge. Journal of Teaching in Social Work27(1–2), 21–35. doi:10.1300/J067v27n01_02

Van Wormer, K., & Davis, D. R. (2002). Addiction treatment: A strengths perspective. Belmont, CA: Wadsworth.

Wayne, J., Bogo, M., & Raskin, M. (2010). Field education as the signature pedagogy of social work education. Journal of Social Work Education, 46(3), 327–339. doi:10.5175/JSWE.2010.200900043

Weissman, M. M., Verdeli, H., Gameroff, M. J., Bledsoe, S. E., Betts, K., Mufson, L., . . . Wickramaratne, P. (2006). National survey of psychotherapy training in psychiatry, psychology, and social work. Archives of General Psychiatry, 63(8), 925–934.

Whaley, A. & Davis, K. (2007). Cultural competence and evidence-based practice in mental health services. American Psychologist, 62(6), 563–574.

Woltmann, E. M., Whitley, R., McHugo, G. J., Brunette, M., Torrey, W. C., . . . Drake, R. E. (2008). The role of staff turnover in the implementation of evidence-based practices in mental health care. Psychiatric Services, 59(7), 732–737. doi:10.1176/appi.ps.59.7.732

Woody, S. R., Weisz, J., & McLean, C. (2005). Empirically-supported treatments: 10 years later. The Clinical Psychologist, 58(4), 5–11.