
The Council on Social Work Education (CSWE) designates field education as the signature pedagogy of social work education in its Educational Policy and Accreditation Standards (EPAS; CSWE, 2008). The EPAS present a competency-based approach to social work education with measurable outcomes for evaluating the integration of knowledge and practice skills. Across many professions, the Objective Structured Clinical Examination (OSCE) has been used for several decades in a variety of settings as a tool for educators to identify gaps in clinical instruction, to gather data for curriculum change, and to evaluate the effectiveness of teachers and trainers (Ali et al., 1999; Anderson et al., 1991; Elliot et al., 1994; Regehr, Freeman, Hodges, & Russell, 1999; Reznick et al., 1998; Sloan et al., 1997; Warf, Donnelly, Schwartz, & Sloan, 1999). In addition, the OSCE can help to improve student confidence and to predict educational outcomes (Ytterberg et al., 1998).

In response to the lack of reliable measures of observed practice, several researchers (Bogo, 2010; Lu, Miller, & Chen, 2002) have adapted the Objective Structured Clinical Examination (OSCE) for social work, establishing reliable and concrete criteria for evaluating students’ actual practice performance and implementation of core skills (Bogo, Regehr, Logie, Katz, Mylopoulos, & Regehr, 2011).

In keeping with social work’s dual mission of attending to the well-being of vulnerable populations and to social justice values, how to train for and measure culturally empathic practice has also become an area of extensive study within social work (Bogo, Regehr, Hughes, Power, & Globerman, 2002; Carpenter, 2005; Cusimano, Rothman, & Keystone, 1998; Gambrill, 2001; Holden, Meenaghan, Anastas, & Metrey, 2002). This article discusses the implications of Lu et al.’s (2011) findings from the assessment of the Social Work OSCE for the development of cultural empathy in field education. Pedersen, Crethar, and Carlson describe “Inclusive Cultural Empathy” as having two defining features:

  1. Culture is defined broadly to include “culture teachers” from the client’s ethnographic (ethnicity and nationality), demographic (age, gender, lifestyle, residence), status (social, educational, economic) and affiliation (formal or informal) backgrounds, and
  2. the empathic counseling relationship values the full range of differences and similarities or positive and negative features as contributing to the quality of that relationship in a dynamic balance (2008, p. 42).

The Social Work Objective Structured Clinical Evaluation (SW-OSCE): Vignettes from Diverse Populations

Lu et al. (2011) developed six distinct simulated client scenarios for use in evaluating MSW students; the scenarios reflect diversity in clients’ age, gender, race, religion, sexual orientation, and socioeconomic status (SES). They were designed to reflect representative clinical case presentations in social work settings that provide opportunities for assessment and referrals, and to embody the principles of cultural competency with diverse populations.

A 10–15 page transcript detailing the client’s behavioral profile and presenting problem was developed for each of the six clinical scenarios. In scenario 1, ‘Ms. Stein,’ a 45-year-old Orthodox Jewish woman, becomes anxious, angry, fearful, and panicked after being told that her adolescent daughter is pregnant. In scenario 2, ‘Ms. Lee,’ a 35-year-old immigrant from China, learns of her daughter’s truancy and poor academic performance. In scenario 3, ‘Mr. Shayan,’ a 20-year-old Iranian international student, feels hopeless and helpless and reports severe insomnia, substance abuse, and motivational problems. In scenario 4, ‘Ms. Williams,’ a 34-year-old African-American woman of Caribbean descent, feels anguish and guilt and has trouble maintaining relationships with her fiancé and family after being diagnosed as HIV-positive. In scenario 5, ‘Ms. Perez,’ a 59-year-old American-born Latina from a high-SES background, reports that she is experiencing depression and fear regarding her future. In scenario 6, ‘Mr. Rodriguez,’ a married 24-year-old American-born Puerto Rican, feels confused about his sexual orientation and expresses distrust of his social worker.

The interviews are conducted with professional actors who are hired and trained in the SW-OSCE protocol. The actors are given the case scenarios and provided with prompts to use when the social worker does not seem to be responding to them. The prompts contain specific verbatim responses and key information to convey to the social worker during the interview. It is essential to the validity and reliability of the protocol that the actors consistently reproduce the same scenario and interaction with each social worker. Practice sessions are held prior to the role plays to ensure the actors’ accurate interpretation of the interview scenario. The methodology includes videotaping sessions in a room with a one-way mirror to simulate a confidential interview and to minimize distractions (Lu et al., 2011).

Clinical Competence-based Behavioral Checklist (CCBC)

The development of simulated client scenarios is only one part of the SW-OSCE methodology. Although researchers who have adapted this protocol for student assessment have developed a variety of evaluation criteria and measures (e.g., Bogo et al., 2004; Lu et al., 2011; O’Hare & Collins, 1997), a social work competence-based behavioral checklist is needed.

Lu and colleagues (Lu et al., 2004; Lu et al., 2011) developed a Clinical Competence-based Behavioral Checklist (CCBC) for use with the SW-OSCE, which incorporates both quantitative and qualitative evaluation. Criteria were consolidated into five categories: interviewing skills, cultural competence, knowledge and intervention strategies, evaluation, and metacompetence (Lu et al., 2011). In addition, each category of the checklist includes a comment section in which participants can elaborate on their scoring rationale. Five ratings are used to assess each student/social worker’s performance in the simulated client scenario: one each from the client actor, the instructor, an academic grader, a student-observer, and a non-expert video recorder (Lu et al., 2011).

Results of Pilot Testing of the Clinical Competence-based Behavioral Checklist (CCBC)

The pilot study of the CCBC measure of the SW-OSCE was conducted with MSW students in a required ethno-cultural issues course at a large northeastern university. For this pilot study, the qualitative data analysis was based on a comparison of 20 students’ interviews of case scenario 4, performed by the same actor:

Ms Williams, a 34-year-old, religious, immigrant African-American woman of Caribbean descent, felt anguished, was confused, and had trouble sleeping and maintaining a normal level of daily functioning after being diagnosed as HIV positive (Lu et al., 2011, p. 177).

Several key findings emerged from the qualitative analysis. First, the data from the pilot study show that clinical skills and cultural empathy are indeed distinct constructs. In Lu et al.’s (2011) analysis, although the level of clinical competency varied among the students, core social work skills and behaviors such as reflective listening, physical attentiveness, empathic engagement, application of a strengths-based approach, and multifaceted assessments were evident. However, client actors’ preferences for specific student interviewers were directly related to those students’ scores on cultural empathy: the actors consistently preferred the student with the highest cultural empathy score. A strong rating in clinical skills, measured by a high overall CCBC score, was not correlated with higher perceived competence in cultural empathy by the actor. Second, though one might assume that there would be greater cultural empathy and rapport where there was a racial/ethnic match between the student and the client, this assumption was not supported by the data. Third, as noted in other research (Ellis, 2001), prior experience or professional training in social work did not predict clinical or cultural competence, and test grades were not strongly correlated with practice competence or cultural empathy scores (Lu et al., 2011).

Implications for Field Education

What do these findings mean for field education? They support the importance of core social work skills such as reflective listening and a strengths-based approach. But interns need to learn cultural empathy in field in addition to traditional empathy. Inclusive cultural empathy is defined as “the learned ability of counselors to accurately understand and respond appropriately to the client’s comprehensive cultural context [including ethnicity, class, gender, disability and sexual orientation], both in its similarities and differences, which may include confrontation and conflict” (Ridley, Ethington, & Heppner, as cited in Pedersen et al., 2008). Multicultural competence begins with awareness and knowledge. Interns should be able to reflect on their own and others’ societal projection: the many impressions and assumptions from various “culture teachers” that color their views. They need to understand intersectionality: the complex effect on individuals of factors that combine to influence their identity, such as their unique family structure, socioeconomic class, gender, race/ethnicity, values, beliefs, and community stressors and resources. They need to attend to positionality: the constellation of factors that limit or endow social power. Given this understanding of social and cultural context, Pedersen et al. (2008) offer a list of basic skills that interns can learn for developing cultural empathy:

[Recognize and] set aside your own biases and judgments.
Listen for the core message in what the client says.
Listen for both verbal and nonverbal messages.
Be flexible and tentative to give clients room [to give voice to their experience].
Be gentle and keep focused on primary issues…
Check out if your empathic response was on target [and] is helpful.

This list suggests a need for a focus in field instruction on utilizing process recordings and journaling to surface assumptions and feelings, on teaching about the social and political context of individual behavior, and on honing interviewing skills informed by cultural understanding. This means seeing the whole person, in context and in relation to all dimensions of lived experience.

The second finding of the study was that ethnic match between intern and client does not predict cultural empathy. This can free interns from the assumption that they cannot be helpful to clients who are different from them. It also moves from a definition of empathy based on similarity to an understanding of empathy as addressing differences, which may include confrontation and conflict. Pedersen et al. (2008) describe a triad training model in which counselors play three roles (client, pro-counselor, and anti-counselor) to surface the possible positive and negative reactions clients may have during an interview. If resources are not available for this kind of role play, interns can try to anticipate and notice the client’s positive and negative responses. Pedersen et al. (2008) also list a number of “micro skills” that interns can use to recognize client resistance to the helping process, overcome defensiveness, address conflict, and recover from mistakes. For example, micro skills to recognize resistance include identification of values conflict, questioning, appropriate interpretation or confrontation, focus on topic, and/or mirroring.

Finally, the finding that prior practice experience did not predict clinical or cultural competence, as perceived by others, can free interns from the undermining perception that lack of experience results in ineffective practice. Field instruction can help to develop cultural empathy by examining students’ intentions, enhancing the compassion and motivation to build rapport with clients, and facilitating assessment and the implementation of helpful interventions based on cultural understanding. The fact that prior practice experience does not predict cultural empathy may also suggest that even experienced field instructors need support; schools of social work should offer training in cultural competence for field instructors as well as students.

In summary, the adaptation of the Objective Structured Clinical Examination to competency-based social work education is an important step in the identification of reliable criteria for evaluating students’ practice skills. Initial research on the measure indicated that cultural empathy is a skill distinct from traditional empathy and that ethnic match between intern and client does not predict cultural empathy. In addition, prior experience is not necessarily associated with cultural empathy. These findings suggest that cultural empathy should be specifically taught in field as well as in the classroom, grounded in an understanding of societal projection, intersectionality, and positionality. Cultural empathy includes attention to difference and conflict; interns need to attend to clients’ negative reactions as well as positive ones, and need skills to manage resistance and confrontation. Since prior experience may not include cultural empathy skills, novice interns should be reassured that their lack of experience does not result in ineffective practice; at the same time, experienced interns and field educators should be offered particular support in developing cultural empathy. Further research based on the SW-OSCE can shed important light on the fostering of cultural competence in social work education, and in field education in particular.

References

Ali, J., Adam, R. U., Josa, D., Pierre, I., Bedaysie, H., West, U., Winn, J., & Haynes, B. (1999). Comparison of performance of interns completing the old [1993] and new interactive [1997] Advanced Trauma Life Support courses. Journal of Trauma-Injury Infection and Critical Care, 46(1), 80-84.

Anderson, D. C., Harris, I. B., Allen, S., Satran, L., Bland, C. J., David-Feickert, J. A., Poland, G. A., & Miller, W. J. (1991). Relationship between student feedback and their performance in an objective structured clinical examination. Academic Medicine, 66, 29-34.

Badger, L. W., & MacNeil, G. (2002). Standardized clients in the classroom: A novel instructional technique for social work educators. Research on Social Work Practice, 12(3), 364-374.

Baez, A. (2005). Development of an Objective Structured Clinical Examination (OSCE) for practicing substance abuse intervention competencies: An application in social work education. Journal of Social Work Practice in the Addictions, 5(3), 3-20.

Bogo, M. (2010). Achieving competence in social work through field education. Toronto, Canada: University of Toronto Press.

Bogo, M., Regehr, C., Logie, C., Katz, E., Mylopoulos, M., & Regehr, G. (2011). Adapting objective structured clinical examinations to assess social work students’ performance and reflections. Journal of Social Work Education, 47(1), 5-18.

Bogo, M., Regehr, C., Hughes, J., Power, R., & Globerman, J. (2002). Evaluating a measure of student field performance in direct service: Testing reliability and validity of explicit criteria. Journal of Social Work Education, 38(3), 385-401.

Bogo, M., Regehr, C., Power, R., Hughes, J., Woodford, M., & Regehr, G. (2004). Toward new approaches for evaluating student field performance: Tapping the implicit criteria used by experienced field instructors. Journal of Social Work Education, 40(3), 417-426.

Carpenter, J. (2005). Evaluating outcomes in social work education. Social Care Institute for Excellence. Retrieved from http://www.scie.org.uk/publications/misc/evalreport.pdf

Cliff, N. (1987). Analyzing multivariate data. San Diego, CA: Harcourt Brace Jovanovich.

Council on Social Work Education. (2008). Educational policy and accreditation standards. Retrieved from http://www.cswe.org/Accreditation/41865.aspx

Cusimano, M. D., Rothman, A., & Keystone, J. (1998). Setting standards for performance assessment: Defining standards of competent performance on an OSCE. Academic Medicine, 73(10), 812-813.

Elliot, D. L., Fields, S. A., Keenen, T. L., Jaffe, A. C., & Toffler, W. L. (1994). Use of group objective structured clinical examination with first-year medical students. Academic Medicine, 69, 990-992.

Ellis, G. (2001). Looking at ourselves: Self-assessment and peer assessment: Practice examples from New Zealand. Reflective Practice, 2(3), 289-302.

Gambrill, E. (2001). Educational policy and accreditation standards: Do they work for clients? Journal of Social Work Education, 37(2), 226-240.

Harden, R. M., & Gleeson, F. A. (1979). Assessment of clinical competence using an Objective Structured Clinical Examination (OSCE). Medical Education, 13, 41-54.

Herie, M., & Martin, G.W. (2002). Knowledge diffusion in social work: A new approach to bridging the gap. Social Work, 47(1), 85-95.

Holden, G., Meenaghan, T., Anastas, J., & Metrey, G. (2002). Outcomes of social work education: The case for social work self-efficacy. Journal of Social Work Education, 38(1), 115-133.

Koroloff, N. M., & Rhyne, C. (1989). Assessing student performance in field instruction. Journal of Teaching in Social Work, 3(2), 3-16.

Lipman-Blumen, J. (1987). Individual and organizational achieving styles: A handbook for researchers and human resource professionals. Claremont, CA: Achieving Styles Institute.

Lu, Y. E., Miller, M. H., & Chen, S. (2002). American revolution in mental health care delivery: Meeting the educational challenge. Journal of Teaching in Social Work, 22(1/2), 167-182.

Lu, Y. E., Medina, C., & Kwong, M. H. (2004). Assessing clinical competency in field education. In Feng & Lu (Eds.), Social workers’ competency assessment and social work education programs. Taipei, Taiwan: National University Press.

Lu, Y. E., Ain, E., Chamorro, C., Chang, C., Feng, J. Y., Fong, R., . . . Yu, M. (2011). A new methodology for assessing social work practice: The adaptation of the objective structured clinical evaluation (SW-OSCE). Social Work Education, 30(2), 170-185. doi:10.1080/02615479.2011.540385

Miller, M. (2004). Implementing standardized client education in a combined BSW and MSW program. Journal of Social Work Education, 40(1), 87-102.

National Association of Social Workers. (2007). The indicators for the achievement of the NASW standards for cultural competence in the social work profession. Washington, DC: Author.

O’Hare, T., & Collins, P. (1997). Development and validation of a scale for measuring social work practice skills. Research on Social Work Practice, 7(2), 228-238.

O’Hare, T., Collins, P., & Walsh, T. (1998). Validation of the practice skills inventory with experienced clinical social workers. Research on Social Work Practice, 8(5), 552-563.

O’Hare, T., Tran, T. V., & Collins, P. (2002). Validating the internal structure of the Practice Skills Inventory. Research on Social Work Practice. 12(5), 653-668.

Pedersen, P. B., Crethar, H. C., & Carlson, J. (2008). Inclusive cultural empathy. Washington, DC: American Psychological Association.

Regehr, C., Regehr, G., Leeson, J., & Fusco, L. (2002). Setting priorities for learning in the field practicum: A comparative study of students and field instructors. Journal of Social Work Education, 38(1), 55-66.

Regehr, G., Freeman, R., Hodges, B., & Russell, L. (1999). Assessing the generalizability of OSCE measures across content domains. Academic Medicine, 74(12), 1320-1322.

Regehr, G., Regehr, C., Bogo, M., & Power, J. (2007). Can we build a better mousetrap? Improving the measures of practice performance in the field practicum. Journal of Social Work Education, 43(2), 327-342.

Reznick, R.K., Regehr, G., Yee, G., Rothman, A., Blackmore, D., & Dauphinee, D. (1998). High-stakes examinations: What do we know about measurement? Academic Medicine, 75(10), S97-S99.

Sloan, D. A., Donnelly, M. B., Schwartz, R. W., Plymale, M. A., Strodel, W. E., Kenady, … Bland, K. I. (1997). The multidisciplinary structured clinical instruction module as a vehicle for cancer education. American Journal of Surgery, 173(3), 220-225.

Tousignant, M. & DesMarchais, J.E. (2002). Accuracy of student self-assessment ability compared to their own performance in a problem-based learning medical program: A correlation study. Advances in Health Sciences Education, 7(1), 19-27.

Ventimiglia, J. A., Marschke, J., Carmichael, P., & Loew, R. (2000). How do clinicians evaluate their practice effectiveness? A survey of clinical social workers. Smith College Studies in Social Work, 70(2), 287-306.

Warf, B. C., Donnelly, M. B., Schwartz, R. W., & Sloan, D. A. (1999). The relative contributions of interpersonal and specific clinical skills to the perception of global clinical competence. Journal of Surgical Research, 86, 17-23.

Wilkinson, T.J., Frampton, C.M., Thompson-Fawcett, M., & Egan, T. (2003). Objectivity in objective structured clinical examinations: Checklists are no substitute for examiner commitment. Academic Medicine, 78(2), 219-223.

Ytterberg, S.R., Harris, I.B., Allen, S.S., Anderson, D.C., Kofron, P.M., Kvasnicka, J.H., McCord, J.P., & Moller, J.H. (1998). Clinical confidence and skills of medical students: Use of an OSCE to enhance confidence in clinical skills. Academic Medicine, 73(10), S103-S105.