
Abstract: Simmons School of Social Work inaugurated a remote field review during the spring of 2012 to replace one traditional face-to-face (F2F) field visit. The field education department surveyed its field instructors and liaisons and some students to ascertain their response to this change; this article reports the findings of these surveys.

Technological advances have made possible a number of innovations that may contribute to the more effective use of limited resources in social work education, including field education. Clinical supervision, training, and field education can now be done online (Birkenmaier et al., 2005; Leyva, 2012; Rosenfeld, 2012; Wolf, 2011), and schools of social work have developed entire online programs (Pringle-Hornsby & Gray, 2012).

Simmons School of Social Work’s MSW program has a clinical concentration, with students generally completing two three-day internships. The traditional model included two face-to-face field visits, one in each semester. In the spring of 2012, the School instituted a spring remote field review to replace the traditional face-to-face (F2F) final field visit for second-year MSW students. This change was concurrent with an update of the field evaluation form in accordance with CSWE competencies. The remote field review was instituted for students in good standing in the final semester of their advanced clinical placement. Agencies and students were notified of this change by their field liaison and the Director of Field. A F2F review was conducted if a field instructor, student, or liaison requested one, whether because the student was having difficulty or simply because a F2F visit was preferred. The expectation was that the final field review would be conducted online; however, only one review was conducted via Skype, and the other remote reviews were held over the phone. The Simmons School of Social Work field education department surveyed its field instructors and liaisons to ascertain their response to this change. This article presents preliminary data from the surveys of field instructors and field liaisons, as well as responses to one question about the remote field review on the student field survey.

The Study

Sample: Field Instructors, Liaisons, and Students

All 156 Simmons field instructors who were supervising a student in their advanced clinical placement were queried. Eighty-five field instructors (54%) responded to the survey, of whom 51 had remote field reviews. The majority of the respondents (79%) were located in agencies in the Greater Boston area; others worked in agencies on the North Shore, the South Shore, and Cape Cod, and in Southwestern Massachusetts, Western Massachusetts, Rhode Island, and New Hampshire. The majority of field instructors (67%) had one to nine years of experience supervising MSW students. Most practiced in an outpatient mental health agency (including substance abuse treatment) or in a hospital setting (psychiatric or medical); other fields of practice included geriatrics, child welfare, college counseling, residential treatment, day care and schools, and forensics.

All 26 field liaisons were surveyed the following fall; 13 (50%) responded to the survey. The two researchers were not included in the sample. All field liaisons were experienced clinical social workers with anywhere from two to nine years’ experience as liaisons. All but three held adjunct appointments. According to the Simmons School of Social Work Field Education Manual (2013), the field liaison “monitors and evaluates the educational process, supports and mentors on their practice and professional development, and provides support to agencies in the form of consultation, mediation, advocacy, problem solving and negotiation.”

A field survey was conducted in spring 2012 with all 157 students in the clinical concentration. The survey covered a number of topics and included one question about the remote field review. Forty students (25%) answered the remote review question. No identifying information was collected about these students.

Data Collection

Data were collected through three different anonymous electronic surveys administered via Survey Monkey. Field instructors were first asked,

“From your perspective, what are the objectives of the Year II final semester field review?”

Then they were asked to respond on a Likert scale to these questions:

“How did the remote Year II final semester field review compare to previous in-person field visits that you have had?” (better, no difference, worse)

“How well did (it) meet your objectives?” (well, adequately, not well)

“How helpful was (it)?” (very, somewhat, a little, not)

In addition, respondents were given space to explain their answers.
The field liaisons were also asked,

“From your perspective, what are the objectives of the Year II final semester field review?”

They were also asked to respond on a Likert scale to these questions:

“How well did the remote Year II final semester field review meet your objectives?” (well, adequately, not well)

“How helpful was (it)?” (very, somewhat, a little, not)

Again, space was provided for them to explain their answers.

Students were asked one open-ended question:

“Please provide feedback on the remote review.”

Space was provided for their comments.

Findings

Field Instructor Responses
Objectives of the field review. Most field instructors mentioned several objectives of the field review. Almost half the 85 survey respondents (48%) said that the primary objective of the field review is to evaluate interns’ progress over the year, most making specific reference to the assessment of competencies. Thirty-seven field instructors (44%) said that the field review should allow for discussion of an intern’s “learning goals,” “growing edges,” and even “obstacles.” Evaluation of growth and future goals was often integrated with discussion of the intern’s future career in social work.

Twelve field instructors (19%) focused on other aspects of the field review besides evaluation, mentioning the importance of “processing the internship experience” with the student. Seventeen (20%) used the final field review to celebrate the student’s success, and 15 (18%) addressed issues related to termination.

Some respondents mentioned objectives related to the internship itself. Eleven (13%) used the field review to assess the strengths and challenges of the internship experience, and one (1%) hoped for feedback on her supervision. Three respondents (4%) saw the field review as offering a collaborative experience between the agency and school.

Satisfaction with the remote field review. Levels of satisfaction varied among the 51 respondents who had a remote field review. Two (4%) said that the remote review was better; 29 (56%) said that there was no difference; and 20 (39%) said it was worse.

One of the two field instructors who thought that the review was better said, “It’s a great twist on the final visit.” The other respondent appreciated the convenience, efficiency, and savings of time; as this person put it, “It was short, sweet, and to the point.”

Over half the respondents had no preference between the remote review and the F2F field visit; they simply said that it was the “same conversation” or “same material” as the in-person visit. Those who found no difference said, “It met my objectives/expectations,” and “it got the job done.” Several respondents said that a remote field review is acceptable as long as there has been an in-person visit in the fall. Several added that a remote review would not be acceptable if the intern had problems.

Those field instructors who thought that the remote field review was worse than the F2F visit gave several reasons. Several found the review “awkward” or “cumbersome,” with some referring to difficulty with the technology. Most of the field instructors who disliked the remote field review mentioned the impersonal nature of a remote review and difficulty with attending to process: “We covered what we needed to, but it lacked the depth and/or affective connection of an in-person meeting.” Several believed that a remote review makes it difficult to monitor body language and nuances. A few respondents said that they found it difficult to address student challenges in a remote field review. One respondent believed that the remote field review “did not show respect to the agency or to the student.”

Field Liaison Responses
Objectives of the field review. When asked about their objectives for the Year II final semester review, all the field liaisons surveyed mentioned the objective of evaluating the students’ competencies and/or progress toward learning goals. Five (38%) saw the remote field review as also dealing with potential student obstacles and problems; as one put it, “Students vary tremendously in their openness to feedback, so sometimes a [face to face] field visit really highlights something useful as far as a direction for learning or helps unstick a stuck place in the learning or in the relationship. Sometimes students are not particularly open; it’s helpful to see that in play.” Six (46%) used the field review to offer career consultation to the student. Eight (62%) talked about the process of the field review, including celebrating the student’s strengths and attending to termination and closure. Three (23%) field liaisons used the field review to monitor the student’s experience in the internship.

Satisfaction with the remote field review. The field liaisons were not enthusiastic about the remote field review. Only one said the remote review met the objectives well. Six (46%) said that it met their objectives adequately, but four (31%) said that the remote field review did not meet their objectives well. Similarly, no respondents said that the remote field review was very helpful; five (38%) thought it was somewhat helpful, four (31%) saw it as a little helpful, and two (15%) saw it as not helpful. Two respondents (15%) did not answer these questions. Only one field liaison mentioned an advantage of the remote field review, saying, “It was helpful in regards to my time because I did not have to travel to the site for the visit.”

Field liaisons gave several reasons for their dissatisfaction with the remote field review, many having to do with the difficulty in attending to process. They said it was awkward, lost “cues and nuances” like facial expressions, did not promote meaningful exchange, and/or did not allow for celebration of the student. Like the field instructors, some liaisons said (consistent with department policy) that the remote field review should not be used if students have difficulties or conflicts with the field instructor. One respondent said, “Some students seemed disappointed around not having opportunities to get more feedback and have meaningful discussions.” Others thought that the remote review did not value the liaison’s abilities, or “minimized the public relations aspect of the site visit…an opportunity to concretely demonstrate our interest and support.”

Student Responses
Most of the 40 students who responded accepted the remote field review. One student (2%) thought the remote review was better than a face-to-face meeting: “I felt that more was accomplished during the Skype meeting.” Eleven students (28%) were pleased with the remote field review, saying that “it saved a lot of time,” that it was “convenient,” and that it was “a fine substitute for the actual visit in the light of resource management.” Eighteen (45%) believed that there was essentially no difference between the remote field review and a F2F field visit, or were ambivalent; one said, “It was efficient but not value added.”

However, 11 students (28%) expressed negative opinions of the remote field review, calling it “awkward” and “impersonal.” One student believed that the remote field review “cheapened” the experience of the field review, saying, “If we, as clinicians, believe that communication and connection is best facilitated by a face-to-face encounter, why would we not heed our own advice for a field meeting?” Another student felt devalued: “It didn’t feel great to have done all of this work […] and then not even warrant a [face to face] visit from my advisor to discuss my growth and progress.”

Reflections

Objectives of the Field Review
Most field instructors and field liaisons agreed that the main objective of the field review is to evaluate the intern’s progress from the first to the second semester. Several respondents made direct reference to the assessment of the intern’s competencies. This finding suggests that the call for better definition of competencies in field (Ligon & Ward, 2005) has been met by the efforts of CSWE to standardize competencies across the educational trajectory, including in field (CSWE, 2008).

However, Morley and Dunstan (2013) criticize the excessive focus on competencies, saying, “An emphasis on competencies or technical, formulaic and ostensibly objective approaches to social work has been adopted in an attempt to manage the increasing complexity and uncertainty of social work practice” (p. 143). The remote field review seemed more conducive to the evaluation of competencies than to attending to process. Field instructors and liaisons in the study wished to encourage students to attend to their feelings about the completion of their field experience, such as pride in their accomplishments or conflict at termination between feelings of attachment and a desire to progress (Baum, 2011). Some respondents also mentioned “processing the internship experience.” Researchers have demonstrated a direct connection between interns’ competencies and the relational process of supervision: field instructors who were trained in a developmental relational approach to field supervision (DRAFS) perceived their students as progressing more rapidly than others in client assessment and in planning and implementation (Deal, Bennett, Mohr, & Hwang, 2011).

Many authors have stressed the importance of collaboration between social work schools and their affiliated agencies (Bogo & Power, 1992; Frumkin, 1980; Globerman & Bogo, 2003; Lawrance, Damron-Rodriguez, Rosenfeld, Sisco, & Volland, 2007; Peleg-Oren & Even-Zahav, 2004). Oversight of the placement experience is considered a central function of the liaison: “The liaison interprets policies and standards to field instructors […] provides feedback both to the student and to the field instructors, […and] is responsible for making recommendations as to the continued use of field sites” (Lyter, 2005, p. 2). Collaboration between the school and agency, in the opinion of respondents in this study, involved not only assessing the strengths and challenges of the internship experience but also providing feedback from the liaison on the field instructor’s supervision and strengthening the relationship between liaison and field instructor. Lyter (2005) asks, “How does one measure the goodwill, public relations, marketing, and problem prevention that the field liaison likely achieves when she/he visits an agency regularly?” (p. 9). Further research is needed to assess whether online field reviews differ from F2F visits in promoting collaboration between social work schools and agencies.

Satisfaction with the Remote Field Review
The Simmons study found that a number of respondents, especially students, considered the remote field review satisfactory; this suggests that the move toward remote field reviews on the part of schools of social work may be positive. A study by Pardasani, Goldkind, Heyman, and Cross-Denny (2012) found that students saw distance education as having a number of strengths as well as challenges. Many field educators believe that technology can enhance communication among field educators, students, and agencies (Ligon & Ward, 2005), and other studies indicate that several forms of communication (face to face, email, or telephone) can be effective (Bennett & Coe, 1998; Danis, Woody, & Black, 2013). In fact, schools of social work in which almost all communication is online assert that students and field instructors use technology to create vibrant communities (Pringle-Hornsby & Gray, 2012). A number of respondents in this study appreciated the efficiency and time savings of the remote field review. The comments of the satisfied respondents were, accordingly, brief and to the point: “It got the job done.”

Even respondents who were satisfied with the remote field review added the caveat that the remote review would not have been as successful had their intern not been progressing satisfactorily. A number of respondents said that they expect to discuss learning goals and “growing edges” in the field visit. Some also referred to problems and “obstacles” to the student’s learning; it was not clear whether these respondents felt that the remote review prevented them from openly addressing areas in which students had not yet demonstrated adequate competence, even in the final semester field review. Further exploration is needed to ascertain whether a F2F visit is more effective than a remote field review in addressing difficult and potentially affect-laden issues in the triad of field instructor, liaison, and intern (Lindy, 2012).

As in other studies (Danis et al., 2013; Lyter, 2005), a number of respondents did not prefer the remote field review to a F2F visit. Responses that the remote review was “awkward,” often because of technological difficulties, may indicate that field instructors need more support in conducting a remote visit. The fact that only one review was conducted via Skype may also indicate that agencies and liaisons need better access to online resources, and field liaisons should receive training in how to conduct Skype or phone field reviews. Lyter (2005) stresses the importance of “expectations”; the field instructors, liaisons, and even students who were accustomed to F2F visits may have found the change difficult. Growing familiarity with technology, combined with increased comfort with an online review, may increase satisfaction with the remote field review, and those already familiar with technology, such as millennial students, may be more satisfied with a remote field review.

Many of the respondents who disliked the remote field review alluded to its impersonality and the reduced opportunity to attend to the affective or relational aspects of the learning process. Tsang (2011) emphasizes the importance of personalization, especially the presence of the field instructor and liaison, in a “market-driven…electronic mediated society” (p. 376). He describes presence as “the state of alertness, connectedness and authenticity that enables the teacher to make a skillful and compassionate response” (p. 376). Lindy (2012) emphasizes that the activation of “implicit procedural memories” (p. 175) in the educational triad (student, field instructor, and liaison) can have a strong impact on student learning. According to Burgoon, nonverbal signals “carry a significant, often dominant portion of the meaning in face to face interchanges” (as cited in Knapp & Miller, 1985, p. 381). For this reason, the dynamics of the triad may be best understood through attunement to both verbal and nonverbal communication in a F2F encounter, where new meaning may be discovered from studying the experience in the moment. The comments of the dissatisfied field instructors and liaisons in this study were often lengthy and impassioned. The fact that a field instructor, a liaison, and a student felt that the remote field review was disrespectful or devaluing could be seen as disappointment with an aspect of the implicit curriculum (CSWE, 2008). As Pardasani et al. (2012) say, “Caution must be advised when universities and schools of social work rush to implement new technologies without understanding their impact on students and instructors” (p. 418).

Need for Further Research
This exploratory study builds on the initial research into the field visit/review by Lyter (2005) and Danis et al. (2013). What is needed now is a national survey with a large sample in which a number of independent variables can be examined. Satisfaction with remote field reviews will probably differ depending on what agencies and students expect; some schools always use remote field reviews, while others never do. Students may be more satisfied with a remote review if they already have a personal connection with their liaisons through a field seminar. It would be helpful to distinguish between schools with and without a clinical concentration, and between BSW and MSW programs. The number, role, and training of field liaisons differ from school to school. Agency variables should also be examined: geographical area, population served, micro or macro focus, etc. Field instructors’ satisfaction with remote field reviews may depend on factors like their age, experience, or practice area. Demographics should be gathered on the students who are surveyed. Other variables to be explored include the frequency of all contact among students, field instructors, and liaisons.

Danis et al.’s (2013) research has an important feature: it randomly assigns field reviews to be conducted in person, by phone, or by email (to which Skype could be added). Lyter’s (2005) research measures the correlation between field visit variables and the dependent variables of student performance and student satisfaction with the overall field experience. If social work educators are also concerned with safeguarding agencies’ commitment to training, the field instructor’s satisfaction with the school of social work might be added to these dependent variables.

The complexity of assessing the objectives of a field review raises the question of what constitutes an optimal field review. Field education departments often specify areas to address; however, a good field review depends on the clinical skill and sophistication of the participants, especially the field liaison. Field departments need resources to recruit, train, mentor and offer consultation to field liaisons. The expertise of these liaisons, as well as the perspective of field instructors and students, could better delineate the aspects of an optimal field review.

Our study suggests that qualitative research might add an important dimension to these quantitative studies. The comments of field instructors, liaisons, and students in this study enhanced the meaning of their numerical responses, especially the negative responses. However, an online survey has the same difficulties as a remote field review: attention to visual cues and verbal tone, and further unpacking of the meaning of phrases such as “processing the internship,” might best take place in Skype interviews or in-person focus groups. One limitation of this study is that the reporting authors are not objective; they are full-time, clinically trained field liaisons accustomed to in-person field visits. Future studies should attend to the conventions of qualitative research, including member checking and attention to the researchers’ bias.

Conclusion

Preliminary data from the initiation of a remote final field review at Simmons School of Social Work raise important questions for future evaluation of the remote field review. New technologies hold promise for the more effective use of limited resources in social work education, including field education. Remote field reviews are a natural part of online social work programs and may be welcomed by schools with agencies at a distance, or by liaisons and interns with busy schedules. However, despite the satisfaction of some respondents with the remote field review, this study suggests certain drawbacks as well. Morley and Dunstan (2013) warn that field education is becoming construed as “an expensive imposition,” which makes it “vulnerable to under-resourcing and cuts implemented by scrutinizing, economic managers” (p. 144). The Council on Social Work Education stresses the importance of adequate resources for both the implicit and explicit curriculum (Educational Policy 3.0; CSWE, 2008). Careful assessment is needed to ascertain the costs as well as the benefits of initiatives like the remote field review.

References

Baum, N. (2011). Social work students’ feelings and concerns about the ending of their fieldwork supervision. Social Work Education: The International Journal, 30(1), 83-97.

Bennett, L. & Coe, S. (1998). Social work field instructor satisfaction with faculty field liaisons. Journal of Social Work Education, 34(3), 345-352.

Berg-Weger, M., Rochman, E., Rosenthal, P., Sporleder, B. & Birkenmaier, J. (2007). A multi-program collaboration in field education. Social Work Education: The International Journal, 26(1), 20-34.

Birkenmaier, J.B., Wernet, S., Berg-Weger, M., Wilson, R., Banks, R., Olliges, R., & Delicath, T. (2005). Weaving a web: The use of Internet technology in field education. Journal of Teaching in Social Work, 25(1/2), 3-19.

Bogo, M. & Vayda, E. (1998). The practice of field instruction in social work: Theory and process. Toronto: University of Toronto Press.

Bogo, M. & Power, R. (1992). New field instructors’ perceptions of institutional support for their roles. Journal of Social Work Education, 28(2), 178-189.

Council on Social Work Education (2008). Educational policy and accreditation standards. Alexandria, VA: Council on Social Work Education. Retrieved from http://www.cswe.org/Accreditation/41865.aspx

Danis, F., Woody, D., & Black, D. (2013). Comparison of face-to-face vs. electronic field liaison contacts. Field Educator, 3(1). Retrieved from http://fieldeducator.simmons.edu/article/comparison-of-face-to-face-vs-electronic-field-liaison-contacts/

Deal, K. H., Bennett, S., Mohr, J., & Hwang, J. (2011). Effects of field instructor training on student competencies and the supervisory alliance. Research on Social Work Practice, 21(6), 712-726.

Evans, J. (2003). Links with learning: The use of online education for keeping in touch on placement. Journal of Practice Teaching in Health and Social Work, 4(3), 6-13.

Frumkin, M. (1980). Social work education and the professional commitment fallacy: A practical guide to field-school relationships. Journal of Education for Social Work, 16(2), 91-99.

Globerman, J. & Bogo, M. (2003). Changing times: Understanding social workers’ motivation to be field instructors. Social Work, 48(1), 65-73.

Knapp, M. & Miller, G. (1985). Handbook of interpersonal communication. Beverly Hills, CA: Sage Publications.

Lawrance, F., Damron-Rodriguez, J., Rosenfeld, P., Sisco, S., & Volland, P. (2007). Strengthening field education in aging through university-community partnership: The practicum partnership program. Journal of Gerontological Social Work, 50(1/2), 135-154.

Leyva, V. (2012). Online supervision of field education. Field Educator, 2(2). Retrieved from http://fieldeducator.simmons.edu/article/online-supervision-of-field-education/

Ligon, J. & Ward, J. (2005). A national study of the field liaison role in social work education programs in the United States and Puerto Rico. Social Work Education: The International Journal, 24(2), 235-243.

Lindy, J. G. (2012). Dynamics of the educational triad. Smith College Studies in Social Work, 82(2/3), 173-194.

Lyter, S. (2005). Social work field liaison agency visits: Factors associated with student performance and satisfaction. Arete, 28(2), 1-11.

Morley, C. & Dunstan, J. (2013). Critical reflection: A response to neoliberal challenges to field education? Social Work Education: The International Journal, 32(2), 141-156.

Peleg-Oren, N. & Even-Zahav, R. (2004). Why do field supervisors drop out of student supervision? The Clinical Supervisor, 23(2), 15-30.

Pringle-Hornsby, E. & Gray, B. (2012). Field learning in online social work programs. Field Educator, 2(2). Retrieved from http://fieldeducator.simmons.edu/article/field-learning-in-online-social-work-programs/

Reisch, M. & Jarman-Rohde, L. (2000). The future of social work in the United States: Implications for field education. Journal of Social Work Education, 36(2), 201-214.

Rosenblum, A. & Raphael, F. (1983). The role and function of the faculty field liaison. Journal of Education for Social Work, 19(1), 67-73.

Rosenfeld, L. (2012). Web-based supervisor training: Real relationships in cyberspace. Smith College Studies in Social Work, 82(2/3), 216-229.

Tsang, N. M. (2011). Ethos of the day—Challenges and opportunities in twenty-first century social work education. Social Work Education: The International Journal, 30(4), 367-380.

Wayne, J., Bogo, M. & Raskin, M. (2006). The need for radical change in field education. Journal of Social Work Education, 42(1), 161-189.

Wolf, A. (2011). Internet and video technology in psychotherapy training and supervision. Psychotherapy, 48(2), 179-181.

Wolfson, G., Magnuson, C., & Marsom, G. (2005). Changing the nature of the discourse: Teaching field seminars online. Journal of Social Work Education, 41(2), 355-361.