Assessment at USCA is the ongoing process of self-improvement through the analysis and evaluation of academic programs and university services using a variety of methods and measurements. Each academic and administrative unit implements an ongoing assessment program that clearly articulates goals and objectives, measures outcomes regularly and in multiple ways, analyzes the findings, and uses the results to improve and adjust both services and curriculum.
University of South Carolina Aiken conducted a senior-only benchmarked administration of the ETS Proficiency Profile in Fall 2014 - Spring 2015. The results are displayed below in the SLO Results tab.
For additional information on USCA’s process for administering the ETS Proficiency Profile, please click the Assessment Process tab below. For information on the students included in the administration, please click the Students Tested tab.
The Proficiency Profile (formerly MAPP) was selected as one of multiple nationally normed measures of students' attainment of general education competencies. It was also selected to measure the critical thinking skills of entering and exiting students as a value-added measure associated with the institution's Quality Enhancement Plan (QEP). As a result, the ETS Proficiency Profile data were extant and available for use in VSA College Portrait SLO reporting as well.
Each year, a stratified, representative random sample of approximately 200 entering freshmen from the Fall is selected to complete the Proficiency Profile. Similarly, a representative stratified sample of about 200 exiting seniors is evaluated each academic year.
Students are contacted by the Assessment Coordinator in the Office of Institutional Effectiveness and are offered several optional times to complete the proctored test. Participants are informed that they are serving as representatives of their class. As an incentive, all test participants are provided coupons for free or discounted food at local restaurants and are entered into a drawing for one of three free pizzas given each semester.
Each program of study must collect assessment data on an annual basis to determine if student learning outcomes are being met. In addition to assessments collected at a departmental level, the institution collects data from standardized or nationally normed tests through the Office of Institutional Effectiveness. These data are then shared with standing committees such as the General Education Committee, the Academic Assessment Committee, and the Strategic Planning Committee. These groups are recommending bodies and are authorized to develop and recommend adoption of new curricular strategies to ensure student learning outcomes are achieved.
At USC Aiken, faculty are responsible for evaluating assessment data and implementing programmatic changes to effect improvements. On a cyclic basis, departments and schools meet with standing committees such as the General Education Committee and the Academic Assessment Committee to discuss General Education and programmatic student learning outcomes and the extent to which current curricular strategies are effective. Formal curriculum changes such as new courses or course sequences are sometimes implemented. Other times, informal changes such as having faculty emphasize or spend more time on specific learning objectives within an existing course are adopted.
Our samples of approximately 200 entering and 200 exiting students are highly representative of the freshman and senior classes. A stratified random sampling procedure is employed that proportionally matches the sample to the student population in terms of gender and race/ethnicity.
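The proportional stratified sampling described above can be sketched as follows. This is an illustrative implementation only, not USCA's actual procedure; the record fields and stratum key are hypothetical placeholders.

```python
import random

def stratified_sample(population, stratum_key, sample_size, seed=0):
    """Draw a proportional stratified random sample.

    `population` is a list of student records (dicts); `stratum_key`
    extracts the stratum label (e.g. a (gender, race/ethnicity) tuple).
    Each stratum's share of the sample matches its share of the
    population, so the sample demographics mirror the student body.
    """
    rng = random.Random(seed)

    # Partition the population into strata.
    strata = {}
    for student in population:
        strata.setdefault(stratum_key(student), []).append(student)

    # Allocate sample slots proportionally, then draw at random
    # within each stratum.
    n = len(population)
    sample = []
    for members in strata.values():
        k = round(sample_size * len(members) / n)
        sample.extend(rng.sample(members, min(k, len(members))))
    return sample

# Hypothetical example: 600 students in stratum "A", 400 in "B";
# a sample of 200 yields about 120 from "A" and 80 from "B".
pop = [{"group": "A"} for _ in range(600)] + [{"group": "B"} for _ in range(400)]
picked = stratified_sample(pop, lambda s: s["group"], 200)
```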
The VSA advises institutions to follow assessment publisher guidelines for determining the appropriate number of students to test. In the absence of publisher guidelines, the VSA provides sample size guidelines for institutions based on a 95% confidence interval and a 5% margin of error. So long as the tested sample demographics represent the student body, this means we can be 95% certain that the "true" population learning outcomes are within +/- 5% of the reported results. For more information on sampling, please refer to the Research Methods Knowledge Base.
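A sample size meeting the 95% confidence / 5% margin-of-error guideline can be computed with Cochran's formula plus a finite population correction. This is a standard textbook calculation offered as an illustration; it is not the VSA's published worksheet, and the population figures in the example are hypothetical.

```python
import math

def required_sample_size(population_size, margin=0.05, z=1.96, p=0.5):
    """Sample size for estimating a proportion at a given confidence
    level and margin of error (Cochran's formula with finite
    population correction).

    z=1.96 corresponds to 95% confidence; p=0.5 is the most
    conservative assumed proportion (it maximizes the required n).
    """
    # Infinite-population sample size.
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    # Finite population correction shrinks n for small populations.
    n = n0 / (1 + (n0 - 1) / population_size)
    return math.ceil(n)

# For a very large population, about 385 students are needed; for a
# hypothetical class of 400 students, the correction reduces that
# substantially.
large = required_sample_size(10 ** 9)
small = required_sample_size(400)
```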
At USCA, senior students who completed the ETS Proficiency Profile Critical Thinking test (n=195) scored higher than 26% of seniors at all other ETS Proficiency Profile-participating institutions in Fall 2014 - Spring 2015.
At USCA, senior students who completed the ETS Proficiency Profile Written Communication test (n=195) scored higher than 29% of seniors at all other ETS Proficiency Profile-participating institutions in Fall 2014 - Spring 2015.
As USCA did not participate in a value-added administration, scores are not adjusted to account for the incoming ability of USCA students.
The chart below shows the distribution of student scores on the ETS Proficiency Profile Critical Thinking test. Students are scored as Not Proficient, Marginal, or Proficient.
The charts below show the distribution of student scores on the three levels of the ETS Proficiency Profile Writing Test. Students are scored as Not Proficient, Marginal, or Proficient on each level. Writing 3 represents more advanced writing skill than Writing 2, which represents more advanced writing skill than Writing 1.