
North Carolina State University College Portrait



North Carolina State University Learning Outcomes

Undergraduate programs, transcripted certificates offered for credit, and co-curricular units participate in the student learning assessment process. The requirements are flexible enough to allow programs and/or colleges wide latitude in implementing the procedures, including integration of undergraduate and graduate program review where desired.

Assessment activities continue to be centered in the programs and are facilitated by the colleges and divisions. The deans of the colleges and the Dean of the Division of Academic and Student Affairs (DASA) are the central figures in managing both the ongoing assessments of student learning outcomes and the eight-year self-studies, with final oversight being the purview of the Provost. Each year, department heads collect reports from each undergraduate academic program and transcripted certificate, write a summary for the college and its co-curricular units, and submit these reports to their college's dean. Deans provide feedback and then submit college-level summaries to the Provost.

The undergraduate programs, certificates and co-curricular units are aided in their efforts by the Division of Academic and Student Affairs (DASA), the Office of Institutional Research and Planning (OIRP), the Undergraduate Academic Assessment and Comprehensive Program Review Steering Council, the Graduate School and assigned university consultants. 

Basic expectations for the undergraduate assessment process:

1. Each program, transcripted certificate and co-curricular unit must have a set of comprehensive student learning outcomes that are measurable (i.e., written with action verbs such as those found in the Bloom's taxonomy tables available online) and that can all be assessed within a 3- to 5-year cycle.

2. Each program, transcripted certificate and co-curricular unit must use direct measures of learning that are aligned with the outcomes, such as test questions (not grades) or projects scored with a rubric (or another method that allows for systematic review of the course product) from upper-level courses. When done well and in the aggregate, these methods allow programs to determine not only whether students achieved the outcome, but also where the program's strengths and weaknesses lie (a simple aggregation of rubric scores is sketched after this list). Each program can measure as many outcomes as it deems appropriate each year, as long as all outcomes are assessed within 3 to 5 years.

3. Each program, transcripted certificate and co-curricular unit must make clear decisions based on the data collected. The spirit of the process is that the faculty review the data and make decisions regarding whether changes are needed and if so, what those changes should be.
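The kind of aggregation described in point 2 can be as simple as averaging rubric scores by criterion and flagging criteria that fall below a program-defined target. The sketch below (in Python) is only an illustration of that idea; the criteria, scores, and target are invented, not drawn from any NC State rubric.

```python
from statistics import mean

# Illustrative aggregation of rubric scores (1-4) collected from an upper-level
# course project. Criterion names, scores, and the target are invented examples.
rubric_scores = {
    "Thesis and argument": [3, 4, 2, 3, 4, 3],
    "Use of evidence":     [2, 2, 3, 2, 3, 2],
    "Organization":        [4, 3, 4, 3, 4, 4],
}

TARGET = 3.0  # assumed program-defined benchmark for "outcome achieved"

for criterion, scores in rubric_scores.items():
    avg = mean(scores)
    status = "meets target" if avg >= TARGET else "below target -- candidate for curricular change"
    print(f"{criterion}: mean {avg:.2f} ({status})")
```

Summaries like this point faculty to the specific criteria (here, "Use of evidence") where changes may be needed, which is the decision-making step described in point 3.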

It is recommended that each program, transcripted certificate and co-curricular unit complete a curriculum map to help identify the courses in which outcomes are addressed and where the best direct evidence can be collected.
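A curriculum map is essentially a matrix of outcomes by courses. The minimal sketch below shows one way such a map might be represented and queried for the courses that offer the best direct evidence; the outcomes, course numbers, and levels are placeholders rather than an actual NC State map.

```python
# Hypothetical curriculum map: for each student learning outcome, the courses
# in which it is introduced, reinforced, or assessed. All entries are invented.
curriculum_map = {
    "Analyze primary sources": {
        "HIS 201": "Introduced", "HIS 350": "Reinforced", "HIS 495": "Assessed",
    },
    "Communicate findings in writing": {
        "ENG 101": "Introduced", "HIS 350": "Reinforced", "HIS 495": "Assessed",
    },
}

def best_evidence_courses(cmap):
    """Courses where direct evidence for each outcome is best collected."""
    return {outcome: [course for course, level in courses.items() if level == "Assessed"]
            for outcome, courses in cmap.items()}

print(best_evidence_courses(curriculum_map))
# {'Analyze primary sources': ['HIS 495'], 'Communicate findings in writing': ['HIS 495']}
```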

North Carolina State University administered the ETS Proficiency Profile from 2010 through 2015, most recently conducting a Senior-only benchmarked administration in Spring 2015. The results of the Spring 2015 administration are displayed below in the SLO Results tab.

For additional information on NC State’s process for administering the ETS Proficiency Profile, please click on the Assessment Process Tab below. For information on the students included in the administration, please click the Students Tested Tab.

Why did you choose the ETS Proficiency Profile for your institutional assessment?

The ETS Proficiency Profile was chosen for our institutional assessment for several reasons.

1. The VSA allows for several measures of student learning. We had used the CLA and CAAP in the past and wanted to pilot the ETS Proficiency Profile.
2. The instrument measured four core skill areas we believe to be important: critical thinking, reading, writing and mathematics.
3. The ETS Proficiency Profile was cost effective and implementation was reasonable.


Which North Carolina State University students are assessed? When?

In Spring 2015, we assessed seniors through stratified samples of senior-level/capstone courses. Courses were selected from each of the nine colleges, with the aim of including a variety of majors within each college. Potential graduating seniors were those who had completed a sufficient number of credit hours by the end of the fall 2014 term such that they could potentially graduate the following semester. There were 7,427 students who, at the end of fall 2014, had completed at least 105 credit hours. Our sample includes only students who had completed at least 105 credit hours by the end of the fall 2014 term and were enrolled in one of the courses selected in our stratified sample. There were 16 senior-level/capstone courses in total that participated, either by allowing the Office of Assessment to administer the assessment during class or by encouraging students to participate in the assessment outside of class. Faculty who taught these senior-level/capstone courses encouraged the participation of their students (in some cases, the assessment was administered during class), but participation was not required. These courses came from eight of the nine colleges. Additionally, all graduating seniors from one college (the college without an available capstone) were invited to participate. In total, 1,095 students were invited to participate in the assessment.

Of the 1,095 students invited to participate through senior level and capstone courses/open invitation to their college, we assessed 238 seniors who met the criteria for credit hours completed.
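A minimal sketch of how this eligibility screen and course-based sample could be assembled is shown below; the record fields and course identifiers are hypothetical, and only the 105-credit-hour threshold comes from the description above.

```python
# Illustrative screen: potential graduating seniors (>= 105 credit hours by the
# end of fall 2014) who were enrolled in a course from the stratified sample.
# Field names and course identifiers are assumptions for illustration only.

CREDIT_HOUR_THRESHOLD = 105

def eligible_seniors(students):
    """Students who had completed at least 105 credit hours by the end of fall 2014."""
    return [s for s in students if s["credit_hours_fall2014"] >= CREDIT_HOUR_THRESHOLD]

def invited_participants(students, sampled_course_ids):
    """Eligible seniors enrolled in at least one selected senior-level/capstone course."""
    return [s for s in eligible_seniors(students)
            if set(s["enrolled_courses"]) & set(sampled_course_ids)]

# Example with made-up records: only student 1 meets both criteria.
students = [
    {"id": 1, "credit_hours_fall2014": 112, "enrolled_courses": ["BIO 495"]},
    {"id": 2, "credit_hours_fall2014": 98,  "enrolled_courses": ["BIO 495"]},
    {"id": 3, "credit_hours_fall2014": 107, "enrolled_courses": ["HIS 300"]},
]
print(invited_participants(students, {"BIO 495"}))
```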


How are assessment data collected?

Potential graduating seniors were identified using institutional data on credit hours completed by the end of the prior term. The Office of Assessment worked closely with faculty across colleges to identify senior-level/capstone courses in which a large number of potentially graduating seniors were enrolled. These students were contacted via an email from the Associate Vice Provost requesting their participation. Staff members from the Office of Assessment administered the pencil-and-paper test on campus on multiple dates early in the spring 2015 semester. The assessment also included a 5-item survey at the end, which asked students to self-report how much effort they put into the assessment and how important it was to them to do well (Student Opinion Survey, Sundre & Moore, 2002). We used the “Effort” scale from this survey to better understand scores in the context of the amount of effort put forth by students.
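As a rough illustration of how a short effort scale like this might be scored, the sketch below averages Likert-type responses per student; the response range and reverse-keying are assumptions for illustration and do not reproduce the actual Student Opinion Survey scoring.

```python
# Hypothetical scoring of a 5-item, 5-point effort scale. Reverse-keyed item
# positions (if any) are flipped before averaging. All details are assumed.
def effort_score(responses, reverse_keyed=(), scale_max=5):
    """Mean of Likert responses after flipping reverse-keyed items."""
    adjusted = [(scale_max + 1 - r) if i in reverse_keyed else r
                for i, r in enumerate(responses)]
    return sum(adjusted) / len(adjusted)

print(effort_score([4, 5, 3, 4, 5]))                      # 4.2
print(effort_score([4, 5, 2, 4, 5], reverse_keyed={2}))   # third item flipped -> 4.4
```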

 


How are data reported within North Carolina State University?

The data were analyzed by members of the Office of Assessment using the raw data provided by ETS.  The results of the analysis will be shared with the Vice Chancellor and Dean of Academic and Student Affairs, the University Council on Undergraduate Education, associate deans of the colleges, the curriculum committees within the colleges, and the university community.


How are assessment data at NC State used to guide program changes and improvements?

Due to limitations in the representativeness of the sample, the data have not been used to make program changes; instead, they have been used to examine overall trends that highlight strengths and weaknesses meriting further study. For example, results highlighted the strength of our students in mathematics and provided further justification for the selection of critical thinking as part of our Quality Enhancement Plan (QEP).


Our sample is not representative of the senior population by college. We were unable to solicit participation from senior-level/capstone courses in each of the colleges, and courses from some of the colleges that did participate had small numbers of enrolled students.

Although incentives were offered, we had difficulty recruiting participants, which affected the representativeness of the sample. Chi-square goodness-of-fit tests indicated that the distribution of seniors taking the assessment differed significantly from the true population of seniors by college (p = 0.00) and by gender (p = 0.00), but the sample was not significantly different by race/ethnicity (p = 0.12).
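As a rough illustration of the representativeness check described above, a chi-square goodness-of-fit test compares the college distribution of tested seniors with the distribution of the full senior population. The counts and population shares below are invented for the example (the actual figures are not published here), and scipy is assumed to be available.

```python
from scipy.stats import chisquare

# Invented example: tested seniors per college (n = 238) and the share of the
# full senior population in each college.
sample_counts = [40, 25, 60, 30, 20, 18, 25, 20]
population_share = [0.20, 0.12, 0.18, 0.14, 0.10, 0.08, 0.10, 0.08]

n = sum(sample_counts)
expected = [share * n for share in population_share]  # counts expected if the sample mirrored the population

stat, p_value = chisquare(f_obs=sample_counts, f_exp=expected)
print(f"chi-square = {stat:.2f}, p = {p_value:.4f}")
# A very small p-value (as reported for college and gender) means the sample
# distribution differs significantly from the population distribution.
```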

The VSA advises institutions to follow assessment publisher guidelines for determining the appropriate number of students to test. In the absence of publisher guidelines, the VSA provides sample size guidelines for institutions based on a 95% confidence interval and a 5% margin of error. So long as the tested sample's demographics represent the student body, this means we can be 95% certain that the "true" population learning outcomes are within +/- 5% of the reported results. For more information on sampling, please refer to the Research Methods Knowledge Base.
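The standard formula behind a 95% confidence level and +/- 5% margin of error, with a finite-population correction, is sketched below. Applying it to the 7,427 potential graduating seniors mentioned above gives a target of roughly 366 test-takers; this is only an illustration of the VSA guideline, not a published NC State target.

```python
import math

def required_sample_size(population, z=1.96, margin=0.05, p=0.5):
    """Sample size for estimating a proportion at the given margin of error,
    with a finite-population correction (Cochran's formula)."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2        # infinite-population size (~384)
    return math.ceil(n0 / (1 + (n0 - 1) / population))

print(required_sample_size(7427))  # about 366 at 95% confidence, +/- 5% margin
```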

At NC State, senior students who completed the ETS Proficiency Profile Critical Thinking test (n=238) scored higher than 52% of seniors at all other ETS Proficiency Profile-participating institutions in Spring 2015.

At NC State, senior students who completed the ETS Proficiency Profile Written Communication test (n=238) scored higher than 57% of seniors at all other ETS Proficiency Profile-participating institutions in Spring 2015.

As NC State did not participate in a value-added administration, scores are not adjusted to account for the incoming ability of NC State students.


Critical Thinking Detail

The chart below shows the distribution of student scores on the ETS Proficiency Profile Critical Thinking test. Students are scored as Not Proficient, Marginal, or Proficient.



Written Communication Detail

The charts below show the distribution of student scores on the three levels of the ETS Proficiency Profile Writing Test. Students are scored as Not Proficient, Marginal, or Proficient on each level. Writing 3 represents more advanced writing skill than Writing 2, which represents more advanced writing skill than Writing 1.
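For readers curious how a chart like this is typically assembled, the sketch below tabulates already-assigned proficiency categories into percentages for each writing level. The records are invented; the proficiency cut scores themselves are set by ETS and are not computed here.

```python
from collections import Counter

# Invented records: each tested student's proficiency category at one writing level.
records = [
    {"level": "Writing 1", "category": "Proficient"},
    {"level": "Writing 1", "category": "Marginal"},
    {"level": "Writing 2", "category": "Not Proficient"},
    {"level": "Writing 2", "category": "Proficient"},
    {"level": "Writing 3", "category": "Marginal"},
]

def category_percentages(records, level):
    """Percentage of students in each proficiency category at one writing level."""
    counts = Counter(r["category"] for r in records if r["level"] == level)
    total = sum(counts.values())
    return {cat: round(100 * counts[cat] / total, 1)
            for cat in ("Not Proficient", "Marginal", "Proficient")}

for lvl in ("Writing 1", "Writing 2", "Writing 3"):
    print(lvl, category_percentages(records, lvl))
```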