
Temple University College Portrait


Temple University Learning Outcomes

Student learning is at the core of Temple's mission. Essential to measuring the university's overall effectiveness is the assessment of student learning and student success.

At Temple, approaches to assessing student learning are as diverse as our academic offerings. At the University level, for example, Temple has established eight core competencies for General Education, and has defined a number of strategies designed to determine success in these outcome areas.

At the school, college, and department level, programs develop and implement assessment plans and processes that evaluate program goals, measure student learning, communicate major findings, and describe continuous improvements.




Temple University conducted a Value-added administration of the ETS Proficiency Profile in 2013 - 2014. The results are displayed below in the SLO Results tab.

For additional information on TU's process for administering the ETS Proficiency Profile, please click on the Assessment Process tab below. For information on the students included in the administration, please click the Students Tested tab.

Why did you choose the ETS Proficiency Profile for your institutional assessment?

Temple chose ETS Proficiency Profile as the VSA institutional assessment measure for a number of reasons. The test offers flexibility in delivery method, administration method, and test length while demonstrating strong reliability and validity.


Which Temple University students are assessed? When?

Incoming freshmen and graduating seniors are offered the chance to participate in the Student Learning Outcomes project on a three-year cycle. Two hundred freshmen were tested in fall 2010 and again in fall 2013. Two hundred graduating seniors were tested shortly before graduation in spring 2011, with the next wave of assessment scheduled for spring 2014. Transfer students are not eligible to participate in the assessment.


How are assessment data collected?

Assessment of student learning includes both direct and indirect measures. Program effectiveness and curricula are examined through a cycle of periodic program reviews and annual assessment reports. Student learning outcomes are measured broadly using the ETS Proficiency Profile as well as locally designed and managed assessment in our academic programs and for GenEd. Course and teaching evaluations are used to assess and improve instructional delivery. Various national and local surveys are used to assess curricula, student support services, and new educational initiatives. 


How are data reported within Temple University?

After the 2010-2011 Student Learning Outcomes pilot testing of freshmen and seniors was completed, Temple submitted the SAT or ACT scores of participants to ETS. Scores were loaded into an ETS-created regression algorithm that predicts Proficiency Profile scores of freshmen and seniors. The difference between predicted and actual Proficiency Profile scores (the "standard difference") is calculated for each cohort, and learning gains are determined by subtracting the standard difference for freshmen from the standard difference for seniors. The results of the learning gains report were widely disseminated throughout the Temple community.
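The value-added calculation described above can be sketched as follows. This is a minimal illustration, not ETS's actual model: the linear prediction coefficients and all cohort scores below are hypothetical placeholders.

```python
# Sketch of the value-added learning-gains calculation described above.
# The regression coefficients and cohort scores are hypothetical placeholders;
# ETS's actual prediction model and data are proprietary.

def predicted_score(sat_score, intercept=300.0, slope=0.12):
    """Predict a Proficiency Profile score from an SAT score
    (hypothetical linear regression)."""
    return intercept + slope * sat_score

def standard_difference(actual_scores, sat_scores):
    """Mean difference between actual and predicted scores for a cohort."""
    diffs = [a - predicted_score(s) for a, s in zip(actual_scores, sat_scores)]
    return sum(diffs) / len(diffs)

# Hypothetical cohort data: (actual Proficiency Profile scores, SAT scores)
freshman_actual, freshman_sat = [440.0, 452.0, 438.0], [1050, 1180, 1020]
senior_actual, senior_sat = [468.0, 475.0, 455.0], [1060, 1150, 1000]

freshman_diff = standard_difference(freshman_actual, freshman_sat)
senior_diff = standard_difference(senior_actual, senior_sat)

# Learning gain = senior standard difference minus freshman standard difference
learning_gain = senior_diff - freshman_diff
print(f"Estimated value-added learning gain: {learning_gain:.1f} points")
```

A positive gain indicates that seniors outperform their predicted scores by more than freshmen do, i.e. learning beyond what entering ability alone would predict.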


How are assessment data at TU used to guide program improvements?

Stakeholders use the results of learning assessments to change the learning environment in many ways, including curricular modifications, pedagogical modifications, assessment approach modifications, outcomes modifications, and new ways of discussing and sharing assessment results.

 


Of 4219 freshmen students eligible to be tested, 200 (5%) were included in the tested sample at Temple University.


Of 2899 senior students eligible to be tested, 200 (7%) were included in the tested sample at Temple University.


Probability sampling, where a small randomly selected sample of a larger population can be used to estimate the learning gains in the entire population with statistical confidence, provides the foundation for campus-level student learning outcomes assessment at many institutions. It's important, however, to review the demographics of the tested sample of students to ensure that the proportion of students within a given group in the tested sample is close to the proportion of students in that group in the total population. Differences in proportions don't mean the results aren't valid, but they do mean that institutions need to use caution in interpreting the results for the groups that are under-represented in the tested sample.

Undergraduate Student Demographic Breakdown

                                      Freshmen                Seniors
                                 Eligible   Tested       Eligible   Tested
                                 Students   Students     Students   Students
Gender
  Female                           51%        67%          51%        67%
  Male                             49%        34%          49%        33%
  Other or Unknown                 <1%        <1%          <1%        <1%
Race/Ethnicity
  US Underrepresented Minority     30%        38%          30%        34%
  White / Caucasian                58%        57%          58%        58%
  International                     3%         3%           3%        <1%
  Unknown                           9%         2%           9%         8%
Low-income (Eligible to receive
  a Federal Pell Grant)            <1%        <1%          <1%        <1%

For the 2010-2011 Student Learning Outcomes pilot, our tested students included a higher proportion of females than exist in our student body. In terms of race and ethnicity, white students were slightly underrepresented, while Asian and Black students were slightly overrepresented.

The VSA advises institutions to follow assessment publisher guidelines for determining the appropriate number of students to test. In the absence of publisher guidelines, the VSA provides sample size guidelines for institutions based on a 95% confidence interval and a 5% margin of error. So long as the tested sample demographics represent the student body, this means we can be 95% certain that the "true" population learning outcomes are within +/- 5% of the reported results. For more information on sampling, please refer to the Research Methods Knowledge Base.
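The sample-size guideline above can be illustrated with the standard formula for estimating a proportion at a given confidence level and margin of error, with a finite-population correction. This is a textbook normal-approximation sketch, an assumption about the kind of calculation behind such guidelines, not a quote from the VSA's own method.

```python
import math

def required_sample_size(population, margin=0.05, z=1.96, p=0.5):
    """Sample size needed to estimate a proportion within `margin` at the
    confidence level implied by `z` (1.96 -> 95%), assuming the most
    conservative variance (p = 0.5) and applying a finite-population
    correction. A textbook formula, not the VSA's published method."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2            # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / population))   # finite-population correction

# Eligible populations reported above
freshmen_needed = required_sample_size(4219)
seniors_needed = required_sample_size(2899)
print(freshmen_needed, seniors_needed)
```

Under these textbook assumptions the formula suggests samples of roughly 350 freshmen and 340 seniors, larger than the 200 tested in each cohort; this is one reason the VSA defers to the assessment publisher's own sampling guidelines where they exist.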

The increase in learning on the performance task is at or near what would be expected at an institution testing students of similar academic abilities.

The increase in learning on the analytic writing task is above what would be expected at an institution testing students of similar academic abilities.


Critical Thinking Detail

The chart below shows the distribution of student scores on the ETS Proficiency Profile Critical Thinking test. Students are scored as Not Proficient, Marginal, or Proficient.



Written Communication Detail

The charts below show the distribution of student scores on the three levels of the ETS Proficiency Profile Writing Test. Students are scored as Not Proficient, Marginal, or Proficient on each level. Writing 3 represents more advanced writing skill than Writing 2, which represents more advanced writing skill than Writing 1.