The University of Akron College Portrait

The University of Akron Learning Outcomes

Assessment of student learning is an integral part of the measurement of the institution’s overall effectiveness. At UA, approaches to assessment of student learning are as diverse as our array of academic offerings.  At the university level, UA has established core competencies as the basis for the general education curriculum and has developed various strategies for determining how our students are doing in meeting these learning outcomes.  At the college and department levels, faculty, in consultation with professional accrediting bodies, industry advisors and others, use assessment processes that identify program objectives, outcomes and the instruments used for measuring these outcomes.

Learning Assessment Examples



The University of Akron conducted a value-added administration of the ETS Proficiency Profile during the fall 2014 and spring 2015 semesters. The results are displayed in the SLO Results tab below.

For additional information on UA’s process for administering the ETS Proficiency Profile, please click on the Assessment Process tab below. For information on the students included in the administration, please click the Students Tested tab.

Why did you choose the ETS Proficiency Profile for your institutional assessment?

The ETS Proficiency Profile was used to measure students’ abilities in four core skill areas: critical thinking, reading, writing, and mathematics.

Data are used to measure learning outcomes, assess program effectiveness, evaluate and inform teaching and learning, and pinpoint strengths and areas for improvement. The results also help drive program enhancements aimed at improved student learning and supplement learning outcomes assessment in selected disciplines.


Which University of Akron students are assessed? When?

A sample of 400 freshmen and 300 seniors was selected for testing. Seniors were tested beginning in fall semester 2014 and continuing into spring semester 2015; freshmen were tested during spring semester 2015.

The freshman sample included students taking courses in Communications, Psychology, and Sociology; the senior sample included students taking courses in Business, Communications, and Sociology.


How are assessment data collected?

In fall 2014 and spring 2015, freshmen were sampled in Effective Oral Communication, Psychology, and Sociology classes. A corresponding sample of seniors was drawn from 400-level classes in three disciplines: Biology, Business, and Communication. The study was designed to obtain useful data to supplement program assessment in the selected disciplines; thus, seniors were selected from a group of disciplines that, taken together, were expected to be broadly representative of the entire UA senior population.


How are data reported within The University of Akron?

Reports will be generated using aggregate data and made available for review. University of Akron aggregate data will be compared with data from other doctoral-granting institutions in all four skill areas.


How are assessment data at UA used to guide program improvements?

After results are presented to senior administrators, determinations will be made regarding: 1) further analysis, 2) the usefulness of the results, 3) dissemination of the results, and 4) future test administrations and protocols.


Of 6,775 freshman students eligible to be tested, 152 (2%) were included in the tested sample at The University of Akron.


Of 5,562 senior students eligible to be tested, 216 (4%) were included in the tested sample at The University of Akron.
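
As a quick check on these figures, the reported percentages are simply the tested count divided by the eligible count. A minimal sketch follows; the counts are those reported above, and the code itself is illustrative only.

```python
# Recompute the share of eligible students who were tested,
# using the counts reported on this page (illustrative only).
eligible = {"freshmen": 6775, "seniors": 5562}
tested = {"freshmen": 152, "seniors": 216}

for group in eligible:
    share = tested[group] / eligible[group]
    print(f"{group}: {tested[group]} / {eligible[group]} = {share:.1%}")

# freshmen: 152 / 6775 = 2.2%  (reported as 2%)
# seniors: 216 / 5562 = 3.9%  (reported as 4%)
```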


Probability sampling, where a small randomly selected sample of a larger population can be used to estimate the learning gains in the entire population with statistical confidence, provides the foundation for campus-level student learning outcomes assessment at many institutions. It's important, however, to review the demographics of the tested sample of students to ensure that the proportion of students within a given group in the tested sample is close to the proportion of students in that group in the total population. Differences in proportions don't mean the results aren't valid, but they do mean that institutions need to use caution in interpreting the results for the groups that are under-represented in the tested sample.

Undergraduate Student Demographic Breakdown

| | Freshmen: Eligible Students | Freshmen: Tested Students | Seniors: Eligible Students | Seniors: Tested Students |
|---|---|---|---|---|
| Gender: Female | 47% | 62% | 47% | 48% |
| Gender: Male | 53% | 38% | 53% | 52% |
| Gender: Other or Unknown | <1% | <1% | <1% | <1% |
| Race/Ethnicity: US Underrepresented Minority | 29% | 28% | 16% | 19% |
| Race/Ethnicity: White / Caucasian | 66% | 68% | 80% | 76% |
| Race/Ethnicity: International | 3% | 3% | 1% | 1% |
| Race/Ethnicity: Unknown | 3% | 1% | 2% | 3% |
| Low-income (Eligible to receive a Federal Pell Grant) | 46% | 42% | 42% | 38% |
| Area of Study: BCAS | 16% | 16% | 29% | 34% |
| Area of Study: CAST | 24% | 12% | 15% | 1% |
| Area of Study: CBA | 7% | 4% | 12% | 64% |
| Area of Study: CHP | 13% | 22% | 19% | 0% |
| Area of Study: EDUC | 2% | 3% | 10% | 1% |
| Area of Study: ENGR | 12% | 11% | 15% | 0% |
| Area of Study: UNC/NDS | 26% | 32% | 0% | 0% |

Of freshmen tested, a higher proportion of females and a lower proportion of males responded than exist in our student body, so we will review the results by gender to make sure there are not different patterns for men and women.
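
To make this kind of review concrete, here is a minimal sketch of comparing tested-sample shares against eligible-population shares and flagging large gaps for separate review. The figures are the freshman gender percentages from the table above; the 5-point threshold is an assumption for illustration, not a UA or VSA rule.

```python
# Compare the demographic mix of the tested sample with the eligible population
# and flag groups whose shares differ by more than a chosen threshold.
# Figures: freshman gender percentages from the table above (illustrative use).
eligible_share = {"Female": 0.47, "Male": 0.53}
tested_share = {"Female": 0.62, "Male": 0.38}

GAP_THRESHOLD = 0.05  # flag gaps larger than 5 percentage points (assumed threshold)

for group in eligible_share:
    gap = tested_share[group] - eligible_share[group]
    status = "review separately" if abs(gap) > GAP_THRESHOLD else "ok"
    print(f"{group}: eligible {eligible_share[group]:.0%}, "
          f"tested {tested_share[group]:.0%}, gap {gap:+.0%} -> {status}")
```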

The VSA advises institutions to follow assessment publisher guidelines for determining the appropriate number of students to test. In the absence of publisher guidelines, the VSA provides sample size guidelines based on a 95% confidence interval and a 5% margin of error. So long as the tested sample demographics represent the student body, this means we can be 95% certain that the "true" population learning outcomes are within +/- 5% of the reported results. For more information on sampling, please refer to the Research Methods Knowledge Base.
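
A 95% confidence interval with a 5% margin of error corresponds to the standard textbook formula for estimating a proportion (z = 1.96, worst-case p = 0.5), usually applied with a finite population correction. The sketch below is our illustration of that textbook formula, not the VSA's published guidelines.

```python
import math

def guideline_sample_size(population, z=1.96, margin=0.05, p=0.5):
    """Textbook sample size for estimating a proportion, with a finite
    population correction (illustrative; not the VSA's published tables)."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2   # infinite-population size (about 384)
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# Using the eligible-student counts reported above:
print(guideline_sample_size(6775))  # freshmen -> about 364
print(guideline_sample_size(5562))  # seniors  -> about 360
```

Under these assumptions, roughly 360 to 365 tested students per class level would support a +/- 5% margin of error; smaller tested samples imply a wider margin.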

The increase in learning on the performance task is at or near what would be expected at an institution testing students of similar academic abilities.

The increase in learning on the analytic writing task is at or near what would be expected at an institution testing students of similar academic abilities.