
University of Pittsburgh - Pittsburgh Campus College Portrait


University of Pittsburgh - Pittsburgh Campus Learning Outcomes

The University of Pittsburgh is committed to excellence in instruction, as evidenced by our culture of assessment, through which we continually evaluate the success of our educational programs and feed the results of those assessments back into our academic planning processes. Each school’s and campus’s student learning outcomes are consistent with the University’s goals for all of our graduates, namely that our students will be able to: think critically and analytically; gather and evaluate information effectively and appropriately; understand and apply basic scientific and quantitative reasoning; communicate clearly and effectively; use information technology appropriate to their discipline; exhibit mastery of their discipline; understand and appreciate diverse cultures, both locally and internationally; work effectively with others; and have a sense of self and of responsibility to others.




University of Pittsburgh - Pittsburgh Campus administered the CLA+ in 2014 - 2015.

University of Pittsburgh - Pittsburgh Campus conducted a Value-added administration of the CLA+ in 2014 - 2015. The results are displayed below in the SLO Results tab.

For additional information on Pitt’s process for administering the CLA+, please click on the Assessment Process tab below. For information on the students included in the administration, please click the Students Tested tab.

Why did you choose the CLA+ for your institutional assessment?

The University of Pittsburgh started administering the Collegiate Learning Assessment (CLA) in 2006-2007 as part of our efforts to measure and improve the student experience. The CLA allows us to measure the overall growth and expected growth of broad cognitive skills such as problem solving, analytical reasoning, critical thinking, and analytical writing. These data provide faculty and administrators with information they can use to improve student learning.


Which University of Pittsburgh - Pittsburgh Campus students are assessed? When?

The CLA is currently administered at the University of Pittsburgh every other year, to freshmen in the fall semester and to graduating seniors in the spring semester. These freshmen and seniors are enrolled full-time, have SAT scores on file, and transferred no more than 9 credits from another university. The CLA was administered continuously from 2006-2007 through 2012-2013.


How are assessment data collected?

To collect data, an invitation to participate in the CLA is sent to a simple random sample of freshmen in the fall and of seniors in the spring, jointly from the Vice Provost of Undergraduate Studies and the Dean of Students. Students who complete the CLA are given a gift certificate for a Pitt sweatshirt from the campus bookstore. In fall 2014, 109 freshmen completed the test, and in spring 2015, 117 seniors completed the test. The Council for Aid to Education, which administers the CLA, recommends that schools test at least 100 students.
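
As a rough illustration of the sampling step described above, the sketch below filters a hypothetical roster down to the eligible population (full-time, SAT score on file, no more than 9 transfer credits) and draws a simple random sample of invitees. The roster fields, record values, and invitation count are assumptions made for the example, not Pitt's actual data model.

```python
import random

# Hypothetical roster records; field names and values are illustrative only.
roster = [
    {"id": 1, "class": "freshman", "full_time": True,  "has_sat": True,  "transfer_credits": 0},
    {"id": 2, "class": "freshman", "full_time": True,  "has_sat": False, "transfer_credits": 0},
    {"id": 3, "class": "freshman", "full_time": True,  "has_sat": True,  "transfer_credits": 12},
    {"id": 4, "class": "freshman", "full_time": False, "has_sat": True,  "transfer_credits": 3},
    {"id": 5, "class": "freshman", "full_time": True,  "has_sat": True,  "transfer_credits": 6},
]

def eligible(student):
    """Eligibility rules described above: full-time, SAT on file, <= 9 transfer credits."""
    return student["full_time"] and student["has_sat"] and student["transfer_credits"] <= 9

def draw_invitees(roster, class_level, n_invite, seed=None):
    """Draw a simple random sample of eligible students to invite."""
    pool = [s for s in roster if s["class"] == class_level and eligible(s)]
    return random.Random(seed).sample(pool, min(n_invite, len(pool)))

print([s["id"] for s in draw_invitees(roster, "freshman", n_invite=2, seed=42)])
```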


How are data reported within University of Pittsburgh - Pittsburgh Campus?

The Council for Aid to Education prepares an institutional report, which is reviewed by the Office of the Provost and the Enrollment Management Committee. This committee is co-chaired by the Vice Provost of Undergraduate Studies and the Dean of Students, and its members include the associate deans of the undergraduate schools, the Chief Enrollment Officer, the University Registrar, and representatives from housing, residence life, the career development and placement center, and institutional research.


How are assessment data at Pitt used to guide program improvements?

These data provide input on the critical thinking and inquiry skills of our students, which is considered within our overall assessment of student learning. We use multiple data sources, including additional surveys, retention and graduation rates, and student learning outcomes assessment within the individual academic programs, to inform program improvements.


Of 3686 freshman students eligible to be tested, 109 (3%) were included in the tested sample at University of Pittsburgh - Pittsburgh Campus.


Of 3916 senior students eligible to be tested, 117 (3%) were included in the tested sample at University of Pittsburgh - Pittsburgh Campus.


Probability sampling, where a small randomly selected sample of a larger population can be used to estimate the learning gains in the entire population with statistical confidence, provides the foundation for campus-level student learning outcomes assessment at many institutions. It's important, however, to review the demographics of the tested sample of students to ensure that the proportion of students within a given group in the tested sample is close to the proportion of students in that group in the total population. Differences in proportions don't mean the results aren't valid, but they do mean that institutions need to use caution in interpreting the results for the groups that are under-represented in the tested sample.

Undergraduate Student Demographic Breakdown

                                            Freshmen                Seniors
                                       Eligible    Tested     Eligible    Tested
                                       Students    Students   Students    Students
Gender
    Female                                53%         64%        49%         50%
    Male                                  47%         36%        51%         50%
    Other or Unknown                      <1%         <1%        <1%         <1%
Race/Ethnicity
    US Underrepresented Minority          22%         17%        19%         23%
    White / Caucasian                     74%         82%        77%         76%
    International                          3%         <1%         3%         <1%
    Unknown                                1%          1%         1%          1%
Low-income (eligible to receive
    a Federal Pell Grant)                 15%         17%        14%         14%

Our freshmen sample included a higher proportion of women and a lower proportion of underrepresented minorities than exist in our student body. Our senior sample, however, was quite representative of our student body.
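
The representativeness check described above can be made concrete with a small comparison of tested-sample proportions against eligible-population proportions. The sketch below uses the freshmen figures from the table; the 5-percentage-point flagging threshold and the 0.5% stand-in for "<1%" are illustrative assumptions, not VSA rules.

```python
# Freshmen proportions from the table above (percentages).
# "<1%" entries are represented as 0.5 purely for illustration.
eligible = {"Female": 53, "Male": 47, "US Underrepresented Minority": 22,
            "White / Caucasian": 74, "International": 3, "Low-income": 15}
tested   = {"Female": 64, "Male": 36, "US Underrepresented Minority": 17,
            "White / Caucasian": 82, "International": 0.5, "Low-income": 17}

THRESHOLD = 5  # illustrative gap (in percentage points) that warrants a closer look

for group in eligible:
    gap = tested[group] - eligible[group]
    note = "review" if abs(gap) >= THRESHOLD else "ok"
    print(f"{group:30s} eligible {eligible[group]:5.1f}%   tested {tested[group]:5.1f}%   gap {gap:+5.1f}   {note}")
```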

The VSA advises institutions to follow assessment publisher guidelines for determining the appropriate number of students to test. In the absence of publisher guidelines, the VSA provides sample size guidelines based on a 95% confidence interval and a 5% margin of error. So long as the tested sample demographics represent the student body, this means we can be 95% certain that the "true" population learning outcomes are within +/- 5% of the reported results. For more information on sampling, please refer to the Research Methods Knowledge Base.
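
As a rough sketch of where such a guideline comes from, the standard sample-size formula for estimating a proportion at 95% confidence and a 5% margin of error, with a finite-population correction, can be applied to the eligible counts reported above. The choice of this particular formula (and of p = 0.5 as the most conservative value) is an assumption about how the guideline would be computed, not something stated by the VSA.

```python
import math

def sample_size(population, margin=0.05, z=1.96, p=0.5):
    """Sample size for estimating a proportion at the given margin of error,
    with a finite-population correction; p = 0.5 is the most conservative choice."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2           # infinite-population size (~384)
    return math.ceil(n0 / (1 + (n0 - 1) / population))  # finite-population correction

print(sample_size(3686))  # eligible freshmen -> about 348
print(sample_size(3916))  # eligible seniors  -> about 350
```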

The increase in learning on the performance task is at or near what would be expected at an institution testing students of similar academic abilities.

The increase in learning on the selected-response questions is at or near what would be expected at an institution testing students of similar academic abilities.
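
For readers unfamiliar with value-added reporting, the statements above compare the growth actually observed between freshmen and seniors with the growth that would be expected given the entering academic ability of the students tested. The sketch below shows the general shape of such a comparison using a simple least-squares fit over made-up institutional data; it illustrates the idea only and is not CAE's actual value-added model.

```python
# Made-up (entering ability, mean senior CLA+ score) pairs for hypothetical peer institutions.
peers = [(1150, 1080), (1200, 1120), (1250, 1160), (1300, 1190), (1350, 1230)]

# Fit expected_score = a + b * entering_ability by ordinary least squares.
n = len(peers)
mean_x = sum(x for x, _ in peers) / n
mean_y = sum(y for _, y in peers) / n
b = sum((x - mean_x) * (y - mean_y) for x, y in peers) / sum((x - mean_x) ** 2 for x, _ in peers)
a = mean_y - b * mean_x

# Hypothetical institution: compare actual senior performance with the expected value.
our_ability, our_actual = 1280, 1185
expected = a + b * our_ability
print(f"expected {expected:.0f}, actual {our_actual}, value added {our_actual - expected:+.0f}")
```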

Seniors Detail

The charts below show the proportion of tested seniors who scored at each level of the three subscales that make up the CLA+ Performance Task. Subscale scores range from 1 to 6, with 6 being the highest. Due to rounding, subscores may not total 100%.

[Charts: Performance Task subscale score distributions for Analysis & Problem Solving, Writing Effectiveness, and Writing Mechanics]

The table below shows students' mean scores on the three subscales that make up the Selected-Response Questions section of the CLA+. Each student's subscore is determined by the number of correct responses in that subsection, adjusted for the difficulty of the question set the student received. Individual student scores are rounded to the nearest whole number.

Subscale (Range: 200 to 800)               Mean Student Score
Scientific & Quantitative Reasoning              620.0
Critical Reading & Evaluation                    599.0
Critique an Argument                             603.0
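
A minimal sketch of how a table like this could be produced from individual results appears below; the difficulty adjustment and the 200-800 conversion used here are stand-ins invented for the example, not CAE's actual scaling.

```python
# Illustrative only: convert raw correct counts to a 200-800 scale using a
# made-up difficulty adjustment, round each student's score, then report the mean.
students = [
    {"correct": 8,  "form_difficulty": 1.05},  # harder form -> small upward adjustment
    {"correct": 10, "form_difficulty": 0.95},
    {"correct": 7,  "form_difficulty": 1.00},
]
MAX_CORRECT = 10  # hypothetical number of questions in the subsection

def scaled_score(student):
    frac = min(1.0, student["correct"] * student["form_difficulty"] / MAX_CORRECT)
    return round(200 + 600 * frac)  # individual scores rounded to whole numbers

scores = [scaled_score(s) for s in students]
print(scores, round(sum(scores) / len(scores), 1))  # per-student scores and their mean
```
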
Freshmen Detail

The charts below show the proportion of tested freshmen who scored at each level of the three subscales that make up the CLA+ Performance Task. Subscale scores range from 1 to 6, with 6 being the highest. Due to rounding, subscores may not total 100%.

[Charts: Performance Task subscale score distributions for Analysis & Problem Solving, Writing Effectiveness, and Writing Mechanics]

The table below shows students' mean scores on the three subscales that make up the Selected-Response Questions section of the CLA+. Each student's subscore is determined by the number of correct responses in that subsection, adjusted for the difficulty of the question set the student received. Individual student scores are rounded to the nearest whole number.

Subscale (Range: 200 to 800)               Mean Student Score
Scientific & Quantitative Reasoning              593.0
Critical Reading & Evaluation                    585.0
Critique an Argument                             585.0