
Abstract: This article presents initial results of an Institutional Review Board (IRB)-approved case study exploring ways students may benefit from completing a capstone project within field practicum and research course sequences. The capstone project consists of an evaluation research project developed and completed during the final two semesters of a student’s MSW program. To assess perceived benefits, the authors surveyed graduating students (N = 59) at the end of their year-long project (n = 39 respondents; response rate 66%). In addition, qualitative data were obtained from written self-assessment exercises (n = 14). Lessons learned can contribute to improving pedagogy and enriching students’ field experiences.

Incorporating agency-based research into social work courses can be challenging for students, classroom instructors, field instructors, and field liaisons, but it can be of great value to agencies that lack the staff capacity to conduct needed evaluation research and to students who need to build research skills. Graduate and undergraduate social work students may be unconvinced that they will need research skills in their professional practice, and agencies may be skeptical of using students to carry out such research. Concerns about social work education and research at all levels of study are not new (Fraser, Jensen, & Lewis, 1993; Moore & Avant, 2008). Nonetheless, social work programs can enhance community engagement and strengthen their research curricula by collaborating with field placement agencies, field faculty, and advanced-year graduate students (Hall, Casstevens, & Fisher-Borne, 2013; Harder, 2010). This article reports on one cohort of students’ perceptions of learning after they had completed evaluation studies as part of the collaborative field practicum/research sequence during their MSW program’s advanced year.

Much of the current literature on social work student research focuses on BSW students (Jacobson & Goheen, 2006; Kapp, 2006; Moore & Avant, 2008; Smith & Gore, 2006; Taliaferro & Ames, 2010; Tompkins, Rogers, & Cohen, 2009). At the graduate level, Morgenshtern, Freymond, Agyapong, and Greeson (2011) find that MSW students feel “intimidated and powerless” (p. 552) when it comes to conducting research. Morgenshtern et al. (2011) further report that students’ attitudes about research become more positive once they understand and experience the practical application of research within the profession.

Holley, Risley-Curtiss, Stott, Jackson, and Nelson (2007) report that MSW students attain greater knowledge and satisfaction when they are asked to perform hands-on research tasks rather than learn from textbook examples (p. 110). Harder’s (2010) study of a service-learning research partnership model, which utilizes agency data and involves presenting analyses at host agencies, supports Holley et al.’s conclusions. Anderson (2002) also proposes a community-based research model. A 2008 report for the Association of American Colleges and Universities finds that high-impact learning opportunities, which include community-based learning, internships, and capstone projects, “appear to engage participants at levels that elevate their performance” (Kuh, 2008, p. 14). Undergraduate students participating in high-impact learning increased their aggregate grade point average during their freshman year (Kuh, 2008). McGill (2012) examines undergraduate capstone experiences, noting that “simply engaging students in high-impact educational practices does not necessarily equate to students achieving the desired outcomes” (p. 488). Kuh and McGill agree that to engage students in effective learning, high-impact opportunities such as capstone projects must be “done well” (Kuh, 2008, p. 14).

Elsewhere, Hall et al. (2013) describe a year-long capstone project developed as a collaboration between the advanced field and research sequences. This collaborative learning experience involves students, field supervisors and faculty, and research instructors in developing and implementing an evaluation project that culminates in a written evaluation report. During the graduate program, social work students complete a one-semester research course in their basic year before moving on to advanced-year work. During the first semester of their advanced year, graduate students work with their field instructors and research professors to design an evaluation proposal based on the needs of their field agencies. After designing and obtaining approval for their project, students may begin to collect and analyze data; although this can occur during the first semester of the year-long process, these tasks more commonly take place in the second semester. Generally, the second semester involves data collection, analysis, and dissemination of results, culminating in a final report on the capstone project (for a detailed description of this collaboration, refer to Hall et al., 2013). To date, approximately 160 students have completed this capstone sequence.

This research and field collaboration was developed to address the research competencies required by the 2008 Educational Policy and Accreditation Standards (EPAS) of the Council on Social Work Education (CSWE, 2008). These core competencies include critical thinking (e.g., Educational Policy 2.1.3) and evaluation (e.g., Educational Policies 2.1.6 and 2.1.10), both of which the capstone project addresses. The EPAS also identify field education as social work’s signature pedagogy and state that the “intent of field education is to connect the theoretical and conceptual contribution of the classroom with the practical world of the practice setting” (CSWE, 2008, p. 8). The capstone project incorporates social work’s signature pedagogy into research-related active learning through program evaluation.

Method

A post-test-only approach was used in this pilot study. All students taking the research capstone course were invited to take part in the survey component of this study (N = 59). The brief online survey was designed to assess students’ learning experiences within the research sequence over both the fall and spring semesters.

The authors used Qualtrics software to distribute a voluntary, anonymous survey to graduating MSW students two weeks prior to graduation (N = 59). Nine surveys were opened but not completed and were excluded from the analysis. A total of 39 of the 59 students responded to the survey questions (a response rate of 66.1%). The survey did not require students to answer each question, was distributed at the end of the spring semester, and was open for two weeks. No incentives were offered for survey completion. The survey was intentionally brief to increase the response rate during a busy time in the academic year. Table 1 presents the survey questions and response options. Descriptive statistical analyses were completed.

The research course section taught by the first author required students (n = 14) to complete a written self-assessment exercise at the end of the spring semester. The exercise was designed to help students reflexively assess their learning experiences in the research sequence across the fall and spring semesters. Questions included self-reflection on knowledge gained and skills and techniques acquired (refer to Table 2). Responses related to knowledge, skills, and techniques were extracted from the completed exercises and uploaded into ATLAS.ti software (2010) for data management and analysis. A content analysis was completed (Cole, 1988).
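To illustrate the kind of simple frequency check reported in the Results (e.g., identifying the most frequently used words across responses), a minimal sketch follows. This is illustrative only: the study itself used ATLAS.ti (2010) for data management and analysis, and the folder name and stop-word list below are hypothetical.

```python
# Illustrative sketch only; the study used ATLAS.ti for coding and analysis.
# The folder name and stop-word list are hypothetical.
from collections import Counter
from pathlib import Path
import re

STOP_WORDS = {"and", "but", "or", "the", "a", "an", "to", "of", "in", "that", "i", "my", "it", "was"}

def word_frequencies(folder: str) -> Counter:
    """Tally word frequencies across de-identified response files in a folder."""
    counts = Counter()
    for path in Path(folder).glob("*.txt"):
        text = path.read_text(encoding="utf-8").lower()
        words = re.findall(r"[a-z']+", text)
        counts.update(w for w in words if w not in STOP_WORDS)
    return counts

if __name__ == "__main__":
    freqs = word_frequencies("self_assessment_responses")
    for word, count in freqs.most_common(10):
        print(f"{word}: {count}")
```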

Results

Survey Results
Thirty-one students responded to the first survey question. Of these, 29 (93.5%) reported benefiting from the practical application of the capstone project. Four students added explanations of the benefits obtained, including the following: (a) “I have a better understanding of how research is done and its potential value to an organization”; (b) “I absolutely loved being able to do ‘real world’ research, monitoring and evaluation. It was a need of a local agency, and I know they will benefit from it and were happy with the work we did”; and (c) “It made me feel like I truly earned the degree because it was the hardest part of graduate school.” Only two students (6.5%) reported not benefiting, with one adding, “This project was useless. My facility didn’t even look at what I did.”

The second survey question asked students to identify all skills they learned from the practical application of the capstone project, offering eight choices, which included “other” (refer to Table 3). Thirty-one of 39 respondents (79.5%) reported they had learned how to conduct an evaluation study from start to finish; this was the most frequently checked item on the skills list. The least checked item (n = 20, 51.3%) was how to assess agency strengths and challenges with measuring program impact. The five skills identified under “other” were as follows: (a) “SPSS”; (b) “modest understanding of all”; (c) “what I don’t want to do with my career”; (d) “how to utilize statistical programs to analyze data”; and (e) “how to present findings to multiple levels of stakeholders.”

Thirty-six students responded to the third survey question. Of these, 11 (30.6%) definitely expected to use skills learned in their future professional work, and 17 (47.2%) expected they would probably do so. Five students (13.9%) reported being unsure, and only three students (8.3%) expected they would probably or definitely not use these skills in future professional work. The only student who provided an explanation reported, “I’m not interested in any professional role that will require research or evaluation.”

Self-Assessment Findings
Content analysis of the 14 self-assessment exercises identified five categories of growth related to knowledge and skill development. These self-assessment exercises were required as class assignments, and the Institutional Review Board approved use of the de-identified data for qualitative analysis. Table 4 lists a selected quote representing each of these categories of knowledge and/or skill development.

Students described the “real world,” “hands on” experience as critical both to learning research skills and to learning to value them. A number of students described the experience as one they could not have initially “imagined” and said that “learning in a real world setting outside of the classroom” allowed for a deeper connection with the work, with each other, and with their agency. Indeed, “experience” was the most frequently used word (with the exception of conjunctions) within the qualitative data. The second most common word within the self-assessments was “do,” highlighting the applied aspect of the capstone project. One student shared that “newfound knowledge of the importance of ensuring that a program is being effective, efficient, and carrying out its mission has a real effect on how I view agencies and my future.”

Through seeing a direct benefit to an agency or community, a number of students reported a desire to continue to utilize research skills in their future careers, even if that work was primarily clinical in nature. As one student shared:

While I may want to practice clinical work to gain my license over the next few years, I do plan to do research and evaluation at the new agency I am at, because it is beneficial for all parties involved to know the difference a program is making.

The collaborative approach of situating research projects in field practicum settings appears to have helped students integrate this learning into their professional identities, as several students reported visualizing themselves continuing to use evaluation and research as social work practitioners.

Study Limitations

The content analysis was conducted with only one of the three sections of the research sequence. In addition, the data were obtained through a class assignment that students submitted for a grade. As such, qualitative findings related to the self-assessment exercise can be expected to show strong positive responses due to social desirability bias. Nonetheless, the positive explanations provided in the quantitative survey are consistent with those of the students completing the self-assessment, who reported a positive association with the research sequence. At least one student who responded to the quantitative survey intends to avoid professional positions involving research.

The lack of a pre-test makes it impossible to establish change over time or to attribute survey results solely to students’ collaborative capstone experience. The authors are also aware that attitude does not necessarily indicate competency. This study is limited in that it explores student perceptions without attempting to measure core competencies (CSWE, 2008); the relevance of student attitudes to practice is also outside the scope of this study. The overall impact of the collaborative capstone project on students’ learning experiences, however, appears to have been positive.

Discussion and Implications

The survey had a 66.1% response rate, and 93.5% of respondents reported benefiting from the practical application of the capstone project, while only 6.5% reported not benefiting. Earlier studies (Epstein, 1987; Jacobson & Goheen, 2006; Taliaferro & Ames, 2010) suggest that “distaste” is a common response from social work students in regard to research (Taliaferro & Ames, 2010, p. 106). Given previous findings on student attitudes toward research, this study’s response rate and lone response indicating disinterest in “any professional role that will require research or evaluation” suggest that the capstone project had an overall positive effect in this area. The capstone project’s experiential, collaborative approach to learning across the research and field curriculum sequences may help to engage potentially “research reluctant” students (Epstein, 1987; Green, Bretzin, Leininger, & Stauffer, 2001; Taliaferro & Ames, 2010). Our findings, though preliminary, suggest that the applied nature of evaluation capstone work within student field placements shows promise for graduate research education.

Social work field education programs using a capstone project or similar model can provide important experiential learning for students who are reluctant or apprehensive about research, while simultaneously providing research knowledge and skills to agencies. As the profession continues to understand and embrace a culture of accountability, more and more agencies need social workers able to synthesize information and data in order to evaluate and influence agency outcomes. The results of this study imply that agency-based evaluations conducted by MSW students may have multiple benefits. Consistent with Harder’s (2010) findings on research partnership models, students were able to develop evaluation projects for which they took ownership. Additional research is needed to understand the impact of such models on agencies and how classroom instructors may influence overall outcomes.

Increasingly, well-prepared social workers are those who have not only excellent direct practice skills but also macro-level skills on which agencies can draw to fulfill their missions and funding mandates. As agency budgets tighten, there is reduced capacity to evaluate programs or conduct agency-based research. The capstone project used an incremental learning approach (Pan & Tang, 2004), in that engagement, learning, and application occurred over an entire academic year. Additionally, this study used a model in which the agency-based evaluation conducted by the student was supported by field liaisons, field instructors, and classroom instructors. There was co-learning, as each partner in the process learned from the others. Overwhelmingly, student respondents expressed confidence in their ability to conduct research from start to finish. Most student respondents also reported that they expect to utilize their research skillset in their future social work careers. Important follow-up will involve investigating agency perspectives on the value of research and evaluation within their respective organizations. Future research involving pre-post evaluations to investigate both student attitudes and practice competencies is warranted.


References

Anderson, S. G. (2002). Engaging students in community-based research: A model for teaching social work research. Journal of Community Practice, 10, 71–87. doi:10.1300/J125v10n02_05

ATLAS.ti Version 6.0. (2010). [Computer software]. Berlin: Scientific Software Development.

Cole, F. L. (1988). Content analysis: Process and application. Clinical Nurse Specialist, 2(1), 53–57.

Council on Social Work Education (CSWE). (2008). Educational policy and accreditation standards. Retrieved from http://www.cswe.org/file.aspx?id=13780

Epstein, I. (1987). Pedagogy of the perturbed: Teaching research to the reluctants. Journal of Teaching in Social Work, 1(1), 71–89. doi:10.1300/J067v01n01_06

Fraser, M. E., Jensen, J. M., & Lewis, R. E. (1993). Research training in social work: The continuum is not a continuum. Journal of Social Work Education, 29(1), 46–62. doi:10.1080/10437797.1993.10778798

Green, R. G., Bretzin, A., Leininger, C., & Stauffer, R. (2001). Research learning attributes of graduate students in social work, psychology, and business. Journal of Social Work Education, 37(2), 333–341. doi:10.1080/10437797.2001.10779058

Hall, J. K., Casstevens, W. J., & Fisher-Borne, M. (2013). The graduate field program and capstone evaluation project. Field Educator, 3(2). Retrieved from http://fieldeducator.simmons.edu/article/the-graduate-field-program-and-capstone-evaluation-project/#more-1742

Harder, J. (2010). Overcoming MSW students’ reluctance to engage in research. Journal of Teaching in Social Work, 30(2), 195–209. doi:10.1080/08841231003705404

Holley, L. C., Risley-Curtiss, C., Stott, T., Jackson, D. R., & Nelson, R. (2007). “It’s not scary”: Empowering women students to become researchers. Affilia, 22(1), 99–115. doi:10.1177/0886109906295812

Jacobson, M., & Goheen, A. (2006). Engaging students in research: A participatory BSW program evaluation. Journal of Baccalaureate Social Work, 12(1), 87–104.

Kapp, S. A. (2006). Bringing the agency to the classroom: Using service learning to teach research to BSW students. Journal of Baccalaureate Social Work, 12(1), 56–70.

Kuh, G. D. (2008). High-impact educational practices: What they are, who has access to them, and why they matter. Washington, DC: Association of American Colleges and Universities.

McGill, P. T. (2012). Understanding the capstone experience through the voices of students. The Journal of General Education, 61(4), 488–504. doi:10.1353/jge.2012.0029

Moore, L. S., & Avant, F. (2008). Strengthening undergraduate social work research: Models and strategies. Social Work Research, 32, 231–235. doi:10.1093/swr/32.4.231

Morgenshtern, M., Freymond, N., Agyapong, S., & Greeson, C. (2011). Graduate social work students’ attitudes toward research: Problems and prospects. Journal of Teaching in Social Work, 31(5), 552–568. doi:10.1080/08841233.2011.615287

Pan, W., & Tang, M. (2004). Examining the effectiveness of innovative instructional methods on reducing statistics anxiety for graduate students in the social sciences. Journal of Instructional Psychology, 31, 149–159.

Smith, R. D., & Gore, M. T. (2006). Bringing research to life: Using social work students in a statewide foster care census. Journal of Baccalaureate Social Work, 11(2), 78–87.

Taliaferro, J. D., & Ames, N. (2010). Implementing an elective BSW community-based evaluation research course. Journal of Baccalaureate Social Work, 15(1), 105–119.

Tompkins, C., Rogers, A., & Cohen, H. (2009). Promoting undergraduate student research collaboration: Faculty perceptions of benefits and challenges. Journal of Baccalaureate Social Work, 14(1), 1–13.