Kenneth P. Dietrich School of Arts and Sciences

The Dietrich School of Arts and Sciences is a very large component of the University, comprising 31 academic departments and offering instruction to more than 12,000 undergraduate and graduate students. According to Bettye J. and Ralph E. Bailey Dean of Arts and Sciences N. John Cooper, the overall long-term goal of the Dietrich School is excellence in scholarship, undergraduate education, and graduate education that drives the University’s reputation as a world-class institution of higher education.

Planning, Assessment, and Links to Institutional Goals

The dean presides over the Dietrich School’s Planning and Budgeting Committee (PBC). PBC reviews the Provost’s response to the previous year’s plan, which includes feedback from the Provost and the Provost’s Area Planning and Budgeting Committee (PAPBC). The annual cycle of reporting, receiving feedback, and reviewing that feedback provides the framework for continuous assessment for improvement, which has become part of the academic culture within the Dietrich School.

The Dietrich School adopted the new annual planning template when preparing its fiscal year 2008 annual plan, but in that first year the Provost was not satisfied that the school had taken full advantage of the strengths of the template format. The PAPBC review of the fiscal year 2008 plan indicated that the plan included too much information about accomplishments from previous years, few clear goals within each strategic priority for the coming year, and few measurable outcome indices for many goals.

The Dietrich School was responsive to this feedback, as reflected in the Provost’s response to the plan submitted for fiscal year 2009: “Your fiscal year 2009 plan is a wonderful follow-up to the vastly improved plan submitted for fiscal year 2008. Your goals, strategic priorities, and activities match those from fiscal year 2008, and each is updated to indicate actions accomplished.”

Despite the Dietrich School’s steady improvement in using the new template matrix for its annual plans, the PAPBC review of the fiscal year 2010 plan indicated a continued need to identify specific and measurable targets, separate long-term planning from annual planning, indicate which goals were being met, and speculate on the implications of not meeting certain goals. The following year, the PBC review committee commented that the fiscal year 2011 plan showed marked improvement on these points.

Using Assessment in Planning, Program Development, and Resource Allocation

While the format of assessment has been evolving, the culture of planning and assessment also has been growing in the Dietrich School. The following examples illustrate how the Dietrich School has used planning and assessment for sustained and effective change.

Central to recent developments in the Dietrich School has been the consistent control of faculty numbers through a system of target numbers for tenured/tenure-stream faculty in each department, based originally on a PBC analysis in the mid-1990s and adjusted subsequently through four major routes: a rightsizing of the faculty for fiscal year 1998, based on an agreed target enrollment of undergraduates; an expansion of the faculty for fiscal year 2005, based on an agreed increase in the target enrollment of undergraduates; academic initiative funding to build on and modernize scholarly strengths in alignment with University strategic goals; and a budget reduction for fiscal year 2010, positioning the University to deal with instability and reduction in the commonwealth appropriation.

Because one consistent goal for the Dietrich School has been to have a larger number of departments ranked more highly in peer evaluations, such as the 1995 National Research Council (NRC) evaluations of doctoral programs, changes in faculty targets for departments have been keyed to that goal. Almost all new faculty positions have been investments in target departments that include anthropology, biological sciences, chemistry, economics, English, history of art and architecture, Hispanic languages and literatures, history, history and philosophy of science, mathematics, music, neuroscience, physics and astronomy, political science, and psychology.

In 2010, when the NRC rankings were released, most of the targeted programs in the Dietrich School had advanced from where they were in 1995, some showing marked improvement even in comparison to the very different system used in the previous study. The 2010 study reflected an unprecedented collection of data on research doctorate programs using a complex and somewhat controversial methodology, and the University has been sorting through, interpreting, and incorporating some of this information into its own benchmarking processes.

Further evidence of the success of this strategic approach to assessment and faculty hiring is the Dietrich School component of the Provost’s nanoscience initiative. In this initiative, six new positions, three each in chemistry and in physics and astronomy, were coordinated with then-current and projected investments in research laboratory modernization and renovation under the University’s 12-year facilities plan to optimize the faculty profiles in these core sciences and to take advantage of scientific and funding opportunities in nanoscience. Success in sciences with access to competitive federal research sponsorship can be tracked through sponsored research expenditures, which in the Dietrich School have increased from $31.1 million in fiscal year 2000 to about $50 million in fiscal year 2011.

To cite another case, the Dietrich School has been committed to improving advising, particularly because academic advising plays such a critical role in the lives of undergraduates. In the late 1990s, the Provost’s office began administering a student satisfaction survey to assess all aspects of the undergraduate experience on the Pittsburgh campus. Results of the first round of surveys showed low satisfaction with academic advising, particularly the advising provided to freshmen and undeclared majors through the central Advising Center. In addition, a review of the administrative data indicated that too many students were being advised through the Advising Center for three or four years because they had not declared a major.

In June 2000, the Dietrich School completed an external review of its advising function that identified areas for improvement. The Advising Center was restructured, and the Dietrich School devoted several years of successful effort to professionalizing the center, including an annual review and assessment. In subsequent years, student satisfaction increased and more students declared majors at the appropriate time. In the late 2000s, improvements in student satisfaction with advising started to slow at the same time that the University was converting to a new student data system (PeopleSoft) that allowed for student self-registration. At this time, the Dietrich School recognized that the role of the academic advisor would change in an era of student self-registration and sought to take advantage of this administrative change to further strengthen advising. The Dietrich School again engaged external experts to conduct a comprehensive review of the Advising Center that resulted in a number of improvements that were adopted in 2010 (Appendix C25).

The impact of assessment on planning, programs, and resource allocation also is evident in department-level processes. In response to feedback from the Provost, the Dietrich School now asks departments to formulate plans that are aligned with the overall Dietrich School plan. In conjunction with Organization Development in the Office of Human Resources, Dietrich School administrators worked to develop a departmental strategic planning process in 2007–08 that included extensive data reporting. This process is being fully integrated with an external review process that occurs every 10 years and involves a site visit by three or four faculty members from outside institutions. The internal process expects departments to use comments from external reviewers as input for the construction of five-year plans that articulate a mission statement consistent with the missions of the Dietrich School and the University, short- and long-term goals for improvement, metrics for assessing outcomes, and a detailed timeline for implementation.

Each discipline in the Dietrich School can interpret assessment within its own framework, but some common tools include capstone course evaluations, course-embedded assessment, standardized tests, portfolio assessment, and surveys. A professor in the Department of History of Art and Architecture noted that, in the gateway courses, assessment changed her approach to “a fluid, dynamic one that is responsive every year to assessment; the syllabi and assessment rubric are amended every year.” Using the outcomes-based assessment approach, the political science department totally revised its major and its honors program and instituted rigorous capstone seminars. The result was a significant increase in the number of political science majors. (See the expanded discussion in the Assessment of Student Learning Outcomes section.)

Benchmarking Data in the Assessment Process

Benchmarking against other institutions has been a challenge for the Dietrich School because, unlike the professional schools, the composition of disciplines that make up colleges and schools of arts and sciences differs from institution to institution, and there is no formal association through which they share data.

While the Dietrich School regularly collects, analyzes, and uses internal benchmarking data, the collection and use of external benchmarking data have been less systematic. In its fiscal year 2001 annual plan, the Dietrich School described a plan to benchmark against peer and aspirational peer institutions. However, the fiscal year 2002 plan reported a lack of available external benchmarking data from comparable arts and sciences programs at peer or aspirational peer institutions and proposed instead a continued reliance on internal data. In his response, the Provost suggested that the Dietrich School work more diligently to find a way to obtain and use external benchmarking. This pattern—reports of little external benchmarking data availability followed by the Provost’s recommendation to strengthen external benchmarking—continued for several years.

Despite this situation, the Dietrich School has made strides in obtaining and using external benchmarking data to drive its decision making. As part of the strategic planning process, departments are asked to obtain external benchmarking data from peer and aspirational peer departments and programs. These data are incorporated into the departmental self-study and provided to external reviewers. For example, the Department of History of Art and Architecture used external benchmarking to develop an improved plan for managing the University Art Gallery.

Benchmarking data also are being used at the school level. For example, a 2005–06 review of cross-institutional benchmarking data included an “endowed chair” analysis. As reported in the Dietrich School’s fiscal year 2007 annual plan, the University of Pittsburgh needed to create at least 20 new endowed chairs in order to compete with its aspirational peer institutions. The Dietrich School has secured support for five new endowed chairs as part of the University’s capital campaign and has the goal of targeting 10 more endowed chair positions over the next five years. In this example, benchmarking data led to the development of a strategic plan to increase targets for voluntary support, and a portion of that increased support was earmarked for the creation of additional endowed chair positions.

Improving and Refining a Sustainable Assessment Process

The Dietrich School has improved its external and self-assessment processes over the last 10 years. Annual plans now more clearly articulate specific goals and more carefully assess progress toward those goals. The plans are more focused and are more consistent with a continuous improvement model. Department-level planning is becoming more formalized. Data provided by departments and programs within their annual reports to the dean are used to help allocate resources across Dietrich School departments. These departmental and program data are presented in summarized form within the Dietrich School annual plan submitted to the Provost. Further, the departmental strategic planning process itself was assessed in 2010, resulting in a more streamlined and focused process.

The Dietrich School has increased its use of internal benchmarking data over time and has made progress toward better use of external benchmarking data. Nevertheless, there is room for improvement. For example, the Provost’s feedback to the Dietrich School after reviewing the fiscal year 2009 plan indicated a continued need for more accurate and interpretable measures of progress. To this end, the University has purchased a national database, Academic Analytics, which provides objective measures of productivity for faculty and PhD-granting departments. This database will augment existing self-assessment capabilities at the department level as departments learn to use it effectively.

In summary, the Dietrich School—the largest and most complex of the units chosen for this representative sampling—has built a culture of assessment that has had an observable and sustained effect at the school level and that increasingly is being embraced by departments, which are using it as an opportunity to strengthen their programs.