Abstract: Although many programs utilize field education outcomes in their overall assessment plan, there are few models for how to use these data for continuous quality improvement, especially when benchmarks have been met. This article presents a model for developing a field-based intervention to improve the incorporation of policy-related content in field. It is grounded in one school’s experience with the 2008 EPAS policy competency, which outcome data showed to be among the lowest-rated competencies over several years in the school’s BASW and MSW programs. Implications for enhancing curriculum content and improving the connection between classroom and field are considered.

Introduction

The Commission on Accreditation (COA) and the Commission on Educational Policy (COEP) of the Council on Social Work Education (CSWE) introduced the 2008 Educational Policy and Accreditation Standards (EPAS), a document that set out the organizing curricular principles of social work education and marked the profession’s transition to competency-based education (CSWE, 2008). Since that time, social work classroom and field educators alike have been expected to use the ten identified, and now well-known, professional core competencies to address the needs and evaluate the outcomes of student learning (Lyter, 2012). Assessment is seen as “an integral component of this new competency-based environment” (CSWE, 2008, p. 16).

Additionally, the 2008 EPAS assigned field education the role of signature pedagogy, “representing the central form of instruction and learning in which a profession socializes its students to perform the role of practitioner” (Shulman, 2005, p. 58). Not surprisingly, assessments conducted in field education are commonly used as one of the two required measures for all practice behaviors in social work programs. Thus, these relatively new developments have moved field education not only into a position of importance in determining individual student competence but also into the heart of program-level competence assessment and curricular revision.

Historically, studies on social work field education have focused on the methodology and pedagogical approaches of field education (Anderson & Harris, 2005; Bogo, Regehr, Power, Hughes, Woodford, & Regehr, 2004; Fortune, McCarthy, & Abramson, 2001; Rocha, 2000; Wayne, Bogo, & Raskin, 2010); the issues and proposed solutions surrounding current field education practices (Berengarten, 1961; Sherraden, Slosar, & Sherraden, 2002; Wayne, Bogo, & Raskin, 2006; Wayne et al., 2010); how the relationship between the field instructor and student affects evaluation (Bogo, Regehr, Hughes, Power, & Globerman, 2002; Bogo et al., 2004; Bogo, Regehr, Woodford, Hughes, Power, & Regehr, 2006; Fortune, McCarthy, & Abramson, 2001; Knight, 2001); and the organizational concerns connecting universities and agencies (Berengarten, 1961; Jarman-Rohde, McFall, Kolar, & Strom, 1997; Knight, 2001; Wayne et al., 2006). Little attention has been paid to examining specific outcomes, beyond individual student competence, that might lead to a better understanding of student learning needs, inform potential curricular revisions, or suggest changes in teaching methods to advance both learning and practice competence.

Competency is defined by Bogo et al. (2004) as “a complex set of behaviors that demonstrate possession of knowledge, skills and attitudes held to a standard associated with effective practice” (p. 418). Although still in its infancy, assessment of social work competence is required for institutional accreditation, and each program sets benchmarks for determining adequate student performance. In field education, if the percentage of students demonstrating competence falls below the benchmark, alarm bells should ring, and action is required to clarify and address the problem through curriculum and/or field changes.

If, on the other hand, the benchmarks are met and only the occasional student falls below the established criteria, this important data source is frequently not utilized as a method of feedback to stimulate further inquiry. The potential question then becomes the following: in addition to assessing individual student competencies, what else might educators do with the information gathered about student self-assessment in field and about how competent field instructors evaluate their students to be?

The position advanced in this paper is that field data showing that benchmarks have been met still provide information that can be used in the assessment feedback loop. These data support the continuous quality improvement process by which social work programs review, evaluate, and improve their curriculum content, teaching methods, and ultimately their practice mission through university and agency dialogue. In this paper, we present a model of how this process can be developed using data related to the policy competency as an example.

The Policy Enigma

Lower-than-average scores in Competency 8 (the policy competency) in one school’s BASW and MSW programs became the starting point for developing a process to make better use of accumulated field outcome assessment data. The data showed that student and field instructor ratings on the policy competency were generally lower than those on the other nine competencies over a two-year period from fall 2009 to spring 2011. Field faculty also noted that field instructors generally rated students higher on the practice behaviors related to this competency than the students rated themselves. The competency itself expects that social work practitioners understand that policy affects service delivery and actively engage in policy practice, and that they know the history and current structures of social policies and services, the role of policy in service delivery, and the role of practice in policy development. According to the CSWE (2008), social workers must “analyze, formulate, and advocate for policies that advance social well-being; and collaborate with colleagues and clients for effective policy action” (p. 6).

To some extent, lower scores for policy are not a surprise. More than 50 years ago, Kendall (1955) identified the social welfare policy curriculum as a point of struggle among schools over what and how to teach about a broad perspective on social needs and social change. Kendall further suggested that it is not enough to instill appreciation of the forces of social change; social work educators must also teach the formulation of enlightened social policy. Anderson and Harris (2005) define the purpose of social welfare policy courses as enhancing students’ understanding of policy formulation, implementation, and evaluation. These are complex concepts, and undergraduate students rank required policy courses among their least preferred (Wolk, Pray, Weismiller, & Dempsey, 1996). Gordon (1994) adds that “students generally see policy skills and policy courses as peripheral to their interests” (p. 65).

Historically, while most of the original seven BASW and MSW field education learning goals identified by this school have evolved into EPAS competencies, policy content was originally included only briefly as a learning objective (now called a practice behavior) under the goal of understanding the human service delivery system. Furthermore, the majority of current field instructors completed their MSW education before the EPAS competencies were introduced and did not experience the same emphasis on policy in their coursework or field placements. Thus, it was logical that field instructors were inclined not to include policy content in student learning agreements and that beginning students viewed policy courses and skills as unrelated to their primary professional interests.

The research and pedagogical question at hand is how to use the field assessment data to enhance policy content in field placements and potentially bring the policy competency rating in line with the others. Our local interest in these questions was reinforced by informal conversations with colleagues in other programs, who reported similar data patterns for the policy competency in their assessment results.

The search became two-fold. First, could we develop a better understanding of the factors that seem to influence these patterns of lower ratings for policy in general and lower self-ratings by students? Second, using these answers, could we develop some type of training for field instructors that would lead to improvements in the policy content in field? Put simply, we were trying to determine how we might emphasize policy content within field education and what we might do within the field curriculum and field instructor training to improve, over time, student policy competency, as assessed by both field instructors and the students themselves.

Methods

As indicated earlier, this study was initiated after an examination of field evaluation outcome data revealed that student and field instructor evaluations in the school’s BASW and MSW field programs consistently rated the policy competency (together with the research and theory competencies) above the benchmark but below the other competencies. The field team presented the outcome data to the Field Advisory Committees in early 2012. In response, and in collaboration with the Field Advisory Committees, the field team compiled a list of the required policy practice behaviors and policy-related activities in which students might engage and include in their learning agreements. These items were included in two surveys designed to better understand why field instructors consistently rated students higher on the policy competency than students rated themselves and why the policy competency was generally rated lower than the other competencies. The surveys were conducted at the end of the spring 2012 semester via an online survey tool. To ground respondents in the policy issue, the cover letter reminded them of the wording of the policy competency.

Sample
Surveys were completed by social work students and social work field instructors at a Midwestern research university. All BASW and MSW students (N=335) and all field instructors (N=290) for the 2011–2012 academic year were sent an email from the field office asking them to complete the survey. Fifty-one percent of field instructors (n=149) and 42% of students (n=140) responded. Student respondents included 14% BASW students (n=19), 34% first-year MSW students (n=47), and 52% second-year MSW students (n=74). The second-year MSW count includes both advanced standing students and students from the two-year MSW program. The majority of MSW student respondents reported being on a clinical track (n=116) rather than a macro track (n=9).

Measures
For each of the 14 required policy practice behaviors listed, field instructors were asked to indicate which areas they believed they were able to teach their student(s), how often they discussed policy-related topics with their student (1=not at all to 5=weekly), and how they rated their own knowledge base in policy as well as their competency to teach social work students both agency policy and social policy (1=not competent to 5=fully competent). Students were asked to indicate which areas they believed they learned at their field placement and how often they discussed policy-related topics with either their field instructor or another agency staff member (1=not at all to 5=weekly). Both field instructors and students were asked open-ended questions about why they believed field instructors rated students higher on the policy competency than students rated themselves.

Analysis
Data were analyzed using SPSS 21. For each policy practice behavior, the proportions of field instructors and students indicating that they taught or learned that behavior, respectively, were compared using Pearson chi-squares. In addition, student endorsement of each practice behavior was compared by year in field placement (BASW, first-year MSW, and second-year MSW) because of the different competency expectations for each year in field. The mean frequencies with which field instructors and students indicated talking about both agency and social policy were compared using t tests. An alpha of .05 was used for determining statistical significance.
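
For readers who wish to replicate these comparisons outside of SPSS, the following minimal sketch shows how the same tests could be run in Python with SciPy. The counts and ratings below are hypothetical placeholders, not the study’s data, and the variable names are illustrative only.

```python
# Illustrative re-creation of the analyses described above (the study itself
# used SPSS 21). All counts and ratings below are hypothetical placeholders.
import numpy as np
from scipy import stats

# Pearson chi-square comparing the proportion of field instructors vs.
# students endorsing one policy practice behavior.
# Rows: instructors, students; columns: endorsed, not endorsed.
table = np.array([
    [120, 29],   # field instructors (hypothetical counts out of n=149)
    [ 98, 42],   # students (hypothetical counts out of n=140)
])
chi2, p, dof, expected = stats.chi2_contingency(table, correction=False)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.3f}")

# Independent-samples t test comparing mean discussion frequency
# (1 = not at all ... 5 = weekly) between instructors and students.
rng = np.random.default_rng(42)
instructor_freq = rng.integers(1, 6, size=149)  # placeholder ratings
student_freq = rng.integers(1, 6, size=140)     # placeholder ratings
t_stat, p_val = stats.ttest_ind(instructor_freq, student_freq)
print(f"t = {t_stat:.2f}, p = {p_val:.3f}  (alpha = .05)")
```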

Results

Survey responses revealed that, in general, field instructors and students indicated a similar perception of what was taught or learned in the area of policy practice. The most commonly endorsed policy practice behaviors for both field instructors and students included “the role of agency policy in service delivery at field placement” and “how to advocate for clients at field placement.” Among the policy practice behaviors least frequently endorsed were “advocate with and lobby administrators to influence policies that impact clients and services” and “how to formulate policies that advance the social well-being of clients served at field placement.” The difference between students and field instructors was statistically significant for only three of the fourteen policy practice behaviors and activities listed (Table 1). Field instructors indicated that they believed they taught “how to advocate for clients at field placement” and “suggest new or revised policies to improve service delivery” more often than students believed they learned them. In contrast, for the activity “write a letter to a legislator to support a current bill that is beneficial to the wellbeing of agency clients,” students indicated they engaged in this policy activity more often than field instructors reported teaching it.

Table 2 compares student perceptions of the policy practice behaviors and activities they learned based on their year in field placement. A significantly higher proportion of second-year MSW students indicated they had learned to write a letter to a legislator. There were no other significant differences. This finding is consistent with the school’s overall policy curriculum, as BASW and first-year MSW students focus on policy analysis in their policy course, while second-year MSW students focus on policy advocacy, which includes letter writing.

There were no significant differences between field instructors and students in perceptions of how frequently they discussed agency policy (M=3.23 vs. M=3.11, t=1.01, p=.314) or how frequently they discussed social policy-related issues (M=3.56 vs. M=3.46, t=.796, p=.427). Students reported speaking less frequently to other agency staff regarding both agency-related policy (M=3.17) and social policy (M=2.57).

Finally, half of the field instructors (50%, n=75) indicated they felt mostly competent in their role advising students in policy practice, and one-fifth indicated they felt fully competent (20%, n=30). Field instructors clearly saw themselves as more knowledgeable about social policy than their students, with 80% (n=119) saying their knowledge was greater or much greater than their students’.

Students and field instructors were both asked to report on the reasons they believed field instructors rated students higher for their policy competencies than students rated themselves. While the number of respondents who provided answers was low, and the comments themselves were quite diverse, two common themes did emerge. First, students are engaged in policy but do not realize it; students are not aware of their own knowledge and/or do not recognize the connections between their work and policy. Second, students generally lack confidence and underestimate themselves, particularly in the area of policy practice.

The Field Office Intervention
After reviewing the survey results and considering possible strategies to improve both field instructor and student perceived policy competencies, members of the field team decided to develop and deliver a continuing education (CE) offering to field instructors on the role of policy practice in reflective supervision. The field team met with the Field Advisory Committees to present the survey findings and seek their input on the content of the CE offering.

Building from these ideas, the field team then worked with faculty members who teach the required policy courses to create a 1.5-hour continuing education workshop designed to enhance policy content in field by helping field instructors think about policy practice in the context of reflective supervision. The overall goal of the workshop was to increase field instructors’ understanding of what policy is so they could better identify policy issues and help students design activities to develop competence in the required policy practice behaviors. The workshop was presented at three fall 2012 field instructor orientations with 167 field instructors in attendance, representing almost half of all field instructors in the school’s state-wide field education program. The workshop’s objectives included the following: 1) review reflective supervision best practices; 2) define policy practice within field education; 3) explore strategies and approaches for teaching policy practice; and 4) integrate strategies for teaching policy into reflective supervision.

Field instructors were provided the same introductory policy content that students receive in their BASW or MSW-foundation curriculum. This information was provided so field instructors could better understand what students were learning in the classroom and thus be able to integrate and reinforce this information through field activities. The importance of policy was discussed, as people are both recipients and providers of social welfare. The presentation then defined policy practice, policy advocacy, social welfare policy and programs, and how they interact with one another. Teaching strategies identified in the workshop—based on Fortune, McCarthy, and Abramson (2001)—included 1) provide a context and link practice principles through explanations; 2) connect classroom content with practice; 3) give examples of application to real situations; 4) provide feedback to students through process or other recordings; and 5) use repetition.

The workshop design also included a time for field instructors to discuss best practices and an opportunity to assess their own policy competency. These discussions generated a number of practical ideas that can be grouped in four broad action themes. First, help students stay up-to-date with current policy. This includes encouraging students to attend trainings and/or policy development meetings and regularly discussing how policies can impact and/or restrict clientele. Second, reflect on the importance of policy both in terms of the ways in which policy impacts a certain population/group and how policy impacts an agency. Third, communicate ways in which policy practice is applied. This may include exposing students to grants and budgets and allowing students to practice writing policy. Fourth, foster growth and acceptance within policy practice by pushing students to understand policies with which they may not agree and having students research and discover the ways in which policy changes over time.

Participants were asked to indicate their understanding of policy competency both before and after the workshop (1=poor to 5=excellent). Overall, participants’ understanding of the topic significantly improved from M=2.85 to M=4.04 (t=15.523, p<.001).
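
The pre/post comparison reported above appears to be a paired-samples t test on the same participants’ ratings; a minimal sketch of that calculation, using invented placeholder ratings rather than the 167 participants’ actual responses, might look like this:

```python
# Minimal sketch of the workshop's pre/post comparison, assuming a
# paired-samples t test on 1-5 self-ratings. The arrays are invented
# placeholders, not the actual responses of the 167 participants.
import numpy as np
from scipy import stats

pre = np.array([3, 2, 3, 4, 2, 3, 3, 2, 4, 3])   # understanding before workshop
post = np.array([4, 4, 5, 5, 3, 4, 4, 4, 5, 4])  # understanding after workshop

t_stat, p_val = stats.ttest_rel(post, pre)  # paired: each person rated twice
print(f"pre M = {pre.mean():.2f}, post M = {post.mean():.2f}, "
      f"t = {t_stat:.2f}, p = {p_val:.4f}")
```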

In addition to the planned CE event for field instructors, the field team also developed a strategy for enhancing the ability of field liaisons to support both field instructors and students in relation to the policy competency. Social work students are divided into small groups where a liaison contributes to, supports, and monitors their progress in field education. Liaisons play an important role reviewing student learning agreements and meeting with students and field instructors to review learning goals and competencies. At the summer 2012 liaison orientation, liaisons were made aware of the program’s efforts to increase field instructors’ and students’ understanding and skills in the policy competency, and they were also asked to review learning agreements in future semesters with a particular focus on policy competency.

Discussion

The CSWE’s (2008) “Educational Policy 4.0—Assessment” in the 2008 EPAS notes the importance of assessment in competency-based education and states that “data from assessment continuously inform and promote change in the explicit and implicit curriculum to enhance attainment of program competencies” (p. 16). In many, if not most, social work programs, data from field instructor evaluations of student competencies in the field setting are collected every semester or quarter. Many programs also collect data from students in field placements concerning their self-perceptions of their competencies.

The accreditation standards require the submission of a complete assessment plan at the time of accreditation and reaffirmation (AS 4.0.1), and AS 4.0.3 extends the plan by stating that a program must describe “the procedures it employs to evaluate the outcomes and their implications for program renewal. It (the program) discusses specific changes it has made in the program based on specific assessment outcomes” (CSWE, 2008, p. 16).

While the data from field instructors and students likely play an important role in meeting the recently enacted CSWE requirement that programs post competency assessment results at least every other year, it is not known to what extent field data fit into programs’ broader responses to the “continuous quality improvement” culture encouraged by AS 4.0.3. This is challenging because the social work literature provides few good examples of how programs can utilize field data constructively to improve overall program quality and outcomes, especially when benchmarks have been met and nothing appears to be “unsatisfactory.” For social work schools and departments whose field operations and resources are already stretched, such an undertaking may seem an unnecessary burden. Yet the activities detailed above provide an example of a relatively low-cost and sustainable approach that is consistent with the expectations of AS 4.0 and that should benefit the program in multiple ways.

In its most basic terms the approach includes these components:

1) Collect good assessment data from field instructors and students.

2) Review the data to see which competencies tend to be lowest and where there are unusual findings or discrepancies.

3) Learn more from field instructors and students via a simple survey concerning how the competency was operationalized in field settings, together with these stakeholders’ ideas and perceptions.

4) Work with academic course instructors to prepare field instructor and liaison training designed to better inform the target groups about the content area.

The start-up challenges in implementing this approach include the following: 1) the need for a “champion” willing and able to lead or facilitate the process; 2) commitment to and support for a program culture that includes assessment and continuous quality improvement; and 3) faculty and/or students with the research skills to conduct the project. Potential benefits for the program include the following: 1) heightened awareness of the curriculum content area and competency chosen for special attention (this focus can be changed each year); 2) increased involvement of key program stakeholders, such as academic faculty, field advisory councils, field liaisons, and field instructors, in developing survey tools, interpreting results, and preparing the actual field instructor training; and 3) ideally, improved ratings over time on the target competency.

The overall result of this approach is to aggregate individual assessments of social work competencies in field settings and then utilize them for program evaluation and continuous improvement. Along the way, there will be increased interactions between the field team and the relevant committees responsible for the academic curriculum. Collaboration will be needed for interpreting the data, developing the field instructor training, and continuing to enhance faculty awareness of how their content area is covered in field placements. It should also help to reinforce classroom and field integration.

How well did the policy enigma example presented above turn out? In terms of why both field instructor and student ratings on the policy competency were relatively low, we cannot explain the reasons definitively, but we have some ideas from the open-ended data and from a review of the policy academic curriculum. While at field placement, students are asked to participate in and to understand both agency policy and social policy. However, their required policy courses focus on social policy, not the details of agency policy. As a result, most students have social policy as their frame of reference for policy practice, in contrast with their field instructors, who are immersed in the day-to-day of agency policy. Students thus have a firmer grasp of social policy, and field instructors have a stronger grasp of agency policy practice. The data in Table 1 above provide some support for this possible explanation: students indicated they were able to learn social policy practice behaviors at a somewhat higher rate than agency policy practice behaviors.

A second explanation is derived from the history of policy content in the field curriculum at this school. Prior to the 2008 EPAS and its focus on competencies, field curricula were built around program goals and objectives, and none of this school’s goals included policy as a core element. Because many of the current field instructors are graduates of the school’s programs, it would be logical to conclude that their own awareness of policy content—and their ability to identify policy work in students’ activities—would be somewhat limited. Furthermore, the general pattern among students of valuing practice content and under-valuing policy content may translate into the actions of these students when they become field instructors. They are not likely to integrate policy into their own practice and thus would have lower awareness of the policy dimension of their work. It is our expectation that efforts such as the one described here will help to break this cycle.

In terms of the second part of the policy enigma—why field instructors rated students higher than the students rated themselves—ideas from the qualitative data suggest some combination of subject matter perceived to be complex and a lack of experience and/or competence on the part of students and field instructors. Interestingly, the data show that field instructors and students appear to view the learning process and the opportunities to learn about aspects of the policy competency in very similar ways.

In terms of using assessment data to develop a training event for field instructors that would lead to improvements in field instructor awareness of policy content and the policy dimension of their work, the narrative above shows that developing a content-rich workshop is indeed feasible. The data became the springboard for conversations with Field Advisory Committees and core academic faculty, a benefit in and of itself. As a result, a well-received CE workshop was developed and provided as a field instructor training event, supplemented by additional work with field liaisons, all of which was intended to improve the policy content in field placements.

But what of the ultimate effect? Has there been an improvement in the policy competency scores, and can any such improvement be linked directly to the field instructor workshop and liaison training? Data from the 2012–2013 academic year, which began shortly after the workshop, revealed that both field instructor and student assessments of the policy competency remained near the bottom of all competencies measured in field assessments, except among students in our macro concentration, for whom it was fifth from the bottom. In part, the lack of immediate progress is not surprising given that the intervention had just occurred; even the most optimistic perspective assumed it would take several years for the increased attention to the policy competency to translate into greater awareness among a majority of field instructors and thus higher assessment scores. Field instructor and student assessment data collection will continue, and the results for the policy competency will be examined annually for signs of progress.

A related issue has to do with a “ceiling effect” inherent in social work curricula. It could be argued that instructor ratings and student self-assessments of competence in the policy area are highly unlikely to exceed those for more practice-oriented competencies, regardless of the quality of any intervention. The one exception may be macro practice and policy majors, for whom policy is their practice.

Even if assessments of the policy competency by field instructors and students improve over time, can the results be attributed to this field instructor training intervention? The answer is a simple and straightforward no, because there are many complex and uncontrollable factors that could affect these assessments. What the field team attempted to accomplish was to shine additional light on various aspects of the policy competency for field instructors and liaisons in the hope that this additional attention would enhance the learning process related to policy at the individual placement level. This is a benefit that goes beyond having competency data showing that students meet predetermined benchmarks related to the policy competency.

The continuous quality improvement culture of this particular social work program supports ongoing investigation of this model for enhancing the value and utility of field assessment data. Thus, over the next two academic years, the field team will target—through additional surveys and field instructor and liaison training—other lower-scoring competencies. As in many other programs, these are the competencies related to research and to theory/HBSE. Over time, if there are improvements in policy competency scores and in the scores on these other competencies, a credible case could be made for these field interventions as viable activities to improve field assessment outcomes. At the same time, all of these subject areas share the ceiling effect described above, limiting the scope of possible improvement relative to the practice competencies. In the interim, the program is benefitting from more widespread focus and discussion—in the field and in academic coursework—concerning the policy competency. This attention is also likely to enhance the field program’s success in connecting the classroom with the practice setting (AS 2.1.1).

In closing, it is important to note two limitations of the research. First, field instructor and student respondents to the online surveys were anonymous and not in matched pairs. Thus, in any comparison of field instructor and student responses, it cannot be determined what proportion of responses represent cases where the student and field instructor respondents were in the same agency; this proportion could range from zero to 100%. Second, response rates for both groups were somewhat low, with even lower response rates for the important open-ended questions included in the surveys. Despite these limitations, there is much to be learned about a potential model for utilizing field assessment data to facilitate continuous quality improvement, especially when all benchmarks have been met.


References

Anderson, D., & Harris, B. (2005). Teaching social welfare policy: A comparison of two pedagogical approaches. Journal of Social Work Education, 41(3), 511–526. doi:10.5175/JSWE.2005.200303120

Berengarten, S. (1961). Educational issues in field instruction in social work. Social Service Review, 35(3), 246–257.

Bogo, M., Regehr, C., Hughes, J., Power, R., & Globerman, J. (2002). Evaluating a measure of student field performance in direct service: Testing reliability and validity of explicit criteria. Journal of Social Work Education, 38(3), 385–401. doi:10.1080/10437797.2002.10779106

Bogo, M., Regehr, C., Power, R., Hughes, J., Woodford, M., & Regehr, G. (2004). Toward new approaches for evaluating student field performance: Tapping the implicit criteria used by experienced field instructors. Journal of Social Work Education, 40(3), 417–425. doi:10.1080/10437797.2004.10672297

Bogo, M., Regehr, C., Woodford, M., Hughes, J., Power, R., & Regehr, G. (2006). Beyond competencies: Field instructors’ descriptions of student performance. Journal of Social Work Education, 42(3), 579–593. doi:10.5175/JSWE.2006.200404145

CSWE. (2008). Educational policy and accreditation standards. Retrieved from http://www.cswe.org/file.aspx?id=13780

Fortune, A., McCarthy, M., & Abramson, J. (2001). Student learning processes in field education: Relationship of learning activities to quality of field instruction, satisfaction and performance among MSW students. Journal of Social Work Education, 37, 111–124. doi:10.1080/10437797.2001.10779040

Gordon, E. B. (1994). Promoting the relevance of policy to practice: Using the ADA to teach social policy. Journal of Teaching in Social Work, 19(2), 165–176.

Jarman-Rohde, L., McFall, J., Kolar, P., & Strom, G. (1997). The changing context of social work practice: Implications and recommendations for social work educators. Journal of Social Work Education, 33(1), 29–46. doi:10.1080/10437797.1997.10778851

Kendall, K. (1955). Curriculum policy and educational practice. Social Service Review, 29(2), 117–124.

Knight, C. (2001). The process of field instruction: BSW and MSW students’ views of effective field supervision. Journal of Social Work Education, 37(2), 357–379. doi:10.1080/10437797.2001.10779060

Lyter, S. C. (2012). Potential of field education as signature pedagogy: The field director role. Journal of Social Work Education, 48(1), 179–188. doi:10.5175/JSWE.2012.201000005

Rocha, C. J. (2000). Evaluating experiential teaching methods in a policy practice course: The case for service learning to increase political participation. Journal of Social Work Education, 36(1), 53–63. doi:10.1080/10437797.2000.10778989

Sherraden, M. S., Slosar, B., & Sherraden, M. (2002). Innovation in social policy: Collaborative policy advocacy. Social Work, 47(3), 209–221. doi:10.1093/sw/47.3.209

Shulman, L. S. (2005, Summer). Signature pedagogies in the professions. Daedalus, 134(3), 52–59. doi:10.1162/0011526054622015

Wayne, J., Bogo, M., & Raskin, M. (2006). The need for radical change in field education. Journal of Social Work Education, 42(1), 161–169. doi:10.5175/JSWE.2006.200400447

Wayne, J., Bogo, M., & Raskin, M. (2010). Field education as the signature pedagogy of social work education. Journal of Social Work Education, 46(3), 327–339. doi:10.5175/JSWE.2010.200900043

Wolk, J. L., Pray, J. E., Weismiller, T., & Dempsey, D. (1996). Political practica: Educating social work students for policymaking. Journal of Social Work Education, 32, 91–100. doi:10.1080/10437797.1996.10672287