Examples of How Assessment of Student Learning Is Leading to Curricular Change

History, Dietrich School of Arts and Sciences

Among the learning outcomes that the Department of History identified are that students will attain mastery in writing a sustained piece of formal, analytical prose and that they will demonstrate expertise in conceptualizing, investigating, and discussing history as a subject of intellectual inquiry. In academic year 2007–08, the department assessed a sample of final papers written in the capstone seminar using a rubric developed by members of the department’s undergraduate committee. Only 45 percent met the standard of capable or better in mastery of writing, and only 40 percent met that standard in expertise. The findings from the 2008 assessment of these two learning outcomes helped to fuel the department’s decision to revamp the undergraduate major in history, doubling the number of required writing seminars and redesigning those seminars so that they function together as a skill-building sequence. Implementation of the new major, however, was put on hold in 2009 due to budget constraints. In 2010, assessment results suggested that the intensive discussion of pedagogy that accompanied the curricular redesign had succeeded in “bringing up the floor,” reducing the proportion of marginal essays, although the number of essays judged proficient still fell short of goals. Recent hires have now made it possible to accelerate implementation of the revised major, which the department believes will have a substantial impact on students’ opportunities to develop the skills in expository writing and historical conceptualization central to these two learning outcomes. (See Appendix C5: Arts and Sciences Undergraduate Assessment Matrices for History, 2008 and 2010.)

Actuarial Mathematics, Dietrich School of Arts and Sciences

Similarly, in academic year 2009–10, the faculty of the actuarial mathematics program assessed their students’ ability to think critically and solve problems. The results were reviewed by the undergraduate committee, program faculty, and the department chair and were used to guide modifications to the content and pace of instruction. With input from graduating students and the Actuarial Mathematics Advisory Board, an updated and more streamlined curriculum was developed that includes one new course and strengthens other courses in the content areas in which students were not meeting the expected standards. A formal liaison was established to improve coordination with the statistics department. The strengthened curriculum and new course were put in place in academic year 2010–11, and the department looks forward to assessing how this new approach better prepares students to meet the standards of the actuarial exam when they take it in 2012. (See Appendix C5: Arts and Sciences Undergraduate Assessment 2010, “Actuarial Math Matrix May 2010.”)

Criminal Justice, Pitt–Greensburg

In reviewing senior papers as part of the assessment of student learning in the criminal justice program at Pitt–Greensburg, the faculty found that students were not achieving the expected levels of competence in identifying a feasible problem, framing a research question in terms of independent and dependent variables, dealing with methodological issues, and communicating conclusions. To address these deficiencies, the faculty introduced a requirement that criminal justice majors take a course in research methods. It is hoped that subsequent assessments will indicate improvement. (See Appendix C5: UPG 2010, “Update on Assessment Pitt–Greensburg May 2010,” pp. 39–49.)

Biology, Pitt–Bradford

Through their assessment of student learning, the biology faculty found weaknesses in students’ ability to correctly define, explain, and describe the basic concepts of biology and to communicate scientific information effectively, both verbally and in writing. The faculty decided to increase the rigor of the introductory biology and sophomore biology courses and to offer a senior seminar in which all students will be required to make oral presentations and will receive feedback from faculty on content and delivery. (See Appendix C5: UPB 2010, “UPB Table of Contents and Matrix,” pp. 15–18.)

Industrial Engineering, Swanson School of Engineering

One ABET-directed expected learning outcome is that students attain “a recognition of the need for, and an ability to engage in, lifelong learning.” One way of assessing this outcome is through questions included on the University’s Student Opinion of Teaching survey. Student responses to these questions are monitored by the undergraduate program director and reported to department faculty so that strengths and areas of concern can be identified. In 2008, the Department of Industrial Engineering (IE) concluded that its students were not demonstrating significant improvement in attainment of this outcome. To address this and other specific learning outcomes, the required undergraduate seminar for IE majors was revised in fall 2008 to require students to write and submit a career plan, and speakers were brought in to address career and professional skill areas. Subsequent assessments of this learning outcome have demonstrated improvement as a result of the revised undergraduate seminar and other program changes. (See Appendix C1, pp. 31–34.)

Graduate Programs

All graduate programs also have processes in place to assess student learning and to use the results to further enrich the curriculum. In all cases, the learning outcomes are developed by the program faculty, and in most cases the process is overseen by an associate dean for graduate studies, or the equivalent, who ensures that the process is being implemented appropriately and provides feedback on the faculty’s assessment plans, sometimes in consultation with a faculty committee. These plans also are included in the overall review conducted by the Provost’s office through the vice provost for graduate studies.

The plans for the individual programs can be found in Appendix C5, but it is worth highlighting a few common aspects of the assessment of student learning in doctoral programs. A key aspect of doctoral training is that all graduates are expected to be able to conduct original research that advances their discipline. For individual students, achievement of this learning outcome is assessed at the dissertation defense, at which representatives of the faculty review the student’s work. As part of the assessment of student learning initiative, some schools have developed rubrics that faculty now use as part of the overall review of the defense. Other programs have assessed this learning outcome by collecting and reviewing data on publications by their graduates, both while they are students and after graduation. Still others look at the levels of research funding their students are able to attract, again both during their studies and after graduation.