North Carolina A&T State University is committed to a process of assessment of both program and student learning outcomes. Each program, both undergraduate and graduate, has developed outcomes related to the mission, vision, and values of the University. These outcomes are assessed annually and are used to make continuous program improvements. As part of the assessment process, the institution has established an Institutional Effectiveness Committee. Its members supervise the assessment process for programs and student learning outcomes in their respective colleges and schools by working closely with the assessment committees, department chairs, and faculty; review assessment reports for each degree program, critiquing them and providing constructive feedback to the department chairs; identify the assessment training needs of the faculty and encourage and facilitate collaborations across schools and colleges and among departments; ensure that key individuals are skilled in assessment and can lead in changing the institutional culture to one that values assessment of learning and programs; and assist in developing and implementing policies and procedures related to program and learning assessment on the campus.
The assessment of student learning outcomes is reinforced by NC A&T’s Quality Enhancement Plan (QEP), which assesses improvements in our students’ critical thinking, and by the new General Education Outcomes, which are measured with course-embedded assessments.
North Carolina Agricultural & Technical State University administered the ETS Proficiency Profile in 2010 - 2012.
North Carolina Agricultural & Technical State University conducted a Value-added administration of the ETS Proficiency Profile in 2010 - 2012. The results are displayed below in the SLO Results tab.
For additional information on NC A&T’s process for administering ETS Proficiency Profile, please click on the Assessment Process Tab below. For information on the students included in the administration, please click the Students Tested Tab.
The ETS Proficiency Profile measures critical thinking, analytic reasoning, and written communication, which align with our general education outcomes. In the past we have used the CLA, and for our Quality Enhancement Plan for Southern Association of Colleges and Schools (SACS) accreditation we have administered the CAAP. The ETS Proficiency Profile was selected for comparison with tests used in the past and has been regarded as a pilot for future assessment measures.
The ETS Proficiency Profile was administered in Fall 2011 to 204 first-year students (freshmen) and to 198 seniors over the Spring 2012 and Fall 2012 semesters.
The administration of the ETS Proficiency Profile occurred over a 12-month period. In Fall 2011, a sample of 204 first-year students was tested in the University Survival (FRST 100) course for freshmen; the tests were administered during a regular class period. In Spring 2012 and Fall 2012, 198 seniors were administered the test during one period of the student's capstone course within their degree program. Administration was performed over two semesters in order to reach the number of students required for the analysis. All tests were administered using paper and pencil.
Since the final administration was in Fall 2012, evaluation of the usefulness of the instrument is ongoing and results for academic programs have not yet been evaluated. The data will be analyzed with respect to the student learning outcomes associated with general education, and the results will be distributed to Deans and Chairs at the end of the Spring 2013 semester.
The ETS Proficiency Profile was used mainly as a pilot to compare administrations of instruments used in the past (for example, in the Wabash Study) or currently used in the Quality Enhancement Plan for accreditation. Since the final administration was in Fall 2012, evaluation of the usefulness of the instrument is ongoing and results for academic programs have not yet been evaluated. The results will be distributed to Deans and Chairs at the end of the Spring 2013 semester.
Of 1875 freshmen students eligible to be tested, 204 (11%) were included in the tested sample at North Carolina Agricultural & Technical State University.
Of 3311 senior students eligible to be tested, 198 (6%) were included in the tested sample at North Carolina Agricultural & Technical State University.
Probability sampling, where a small randomly selected sample of a larger population can be used to estimate the learning gains in the entire population with statistical confidence, provides the foundation for campus-level student learning outcomes assessment at many institutions. It's important, however, to review the demographics of the tested sample of students to ensure that the proportion of students within a given group in the tested sample is close to the proportion of students in that group in the total population. Differences in proportions don't mean the results aren't valid, but they do mean that institutions need to use caution in interpreting the results for the groups that are under-represented in the tested sample.
Undergraduate Student Demographic Breakdown
| Demographic Group | Freshmen: Eligible | Freshmen: Tested | Seniors: Eligible | Seniors: Tested |
| Other or Unknown | <1% | <1% | <1% | <1% |
| US Underrepresented Minority | 96% | 92% | 88% | 85% |
| White / Caucasian | 2% | 2% | 6% | 5% |
| Low-income (Eligible to receive a Federal Pell Grant) | 69% | 68% | 58% | 54% |
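The proportion check described above can be sketched in a few lines of Python. This is an illustration only: the freshman figures are taken from the table (with "<1%" treated as 0.5% for arithmetic), and the helper name and 5-point threshold are our own choices, not part of any VSA guideline.

```python
# Illustrative representativeness check using the freshman column of the
# demographic table above. "<1%" entries are approximated as 0.5%.
FRESHMAN_SHARES = {
    # group: (share of eligible students, share of tested students), in percent
    "Other or Unknown": (0.5, 0.5),
    "US Underrepresented Minority": (96.0, 92.0),
    "White / Caucasian": (2.0, 2.0),
    "Low-income (Pell-eligible)": (69.0, 68.0),
}

def flag_underrepresented(shares, threshold=5.0):
    """Return groups whose tested share trails their eligible share
    by more than `threshold` percentage points."""
    return [group for group, (eligible, tested) in shares.items()
            if eligible - tested > threshold]

print(flag_underrepresented(FRESHMAN_SHARES))  # no group exceeds 5 points here
```

With a 5-point threshold no freshman group is flagged; the largest gap in the table (96% eligible versus 92% tested for US Underrepresented Minority students) would only be flagged under a stricter threshold.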
The sample used in the ETS Proficiency Profile was random for the first-year students (freshmen), since selection was based on the course section in which students enrolled; for the seniors, however, many were selected based on their major, so senior representation comes from just a few majors.
The VSA advises institutions to follow assessment publisher guidelines for determining the appropriate number of students to test. In the absence of publisher guidelines, the VSA provides sample size guidelines for institutions based on a 95% confidence interval and 5% margin of error. So long as the tested sample demographics represent the student body, this means we can be 95% certain that the "true" population learning outcomes are within +/- 5% of the reported results. For more information on sampling, please refer to the Research Methods Knowledge Base.
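As a rough sketch of how such a sample size guideline can be computed, the following Python example applies the standard sample-size formula for a proportion, with a finite-population correction, at a 95% confidence level and 5% margin of error. The population counts come from the "students eligible to be tested" figures above; this is an illustration under textbook assumptions, not the VSA's own worksheet.

```python
import math

def required_sample_size(population, z=1.96, margin=0.05, p=0.5):
    """Minimum n to estimate a proportion p to within `margin`
    at the confidence level implied by z, in a finite population."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2   # infinite-population sample size
    n = n0 / (1 + (n0 - 1) / population)        # finite-population correction
    return math.ceil(n)

print(required_sample_size(1875))  # eligible freshmen
print(required_sample_size(3311))  # eligible seniors
```

Using p = 0.5 gives the most conservative (largest) sample size, since p(1 - p) is maximized there.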
The increase in learning on the performance task is at or near what would be expected at an institution testing students of similar academic abilities.
The increase in learning on the analytic writing task is above what would be expected at an institution testing students of similar academic abilities.
The chart below shows the distribution of student scores on the ETS Proficiency Profile Critical Thinking test. Students are scored as Not Proficient, Marginal, or Proficient.
The charts below show the distribution of student scores on the three levels of the ETS Proficiency Profile Writing Test. Students are scored as Not Proficient, Marginal, or Proficient on each level. Writing 3 represents more advanced writing skill than Writing 2, which represents more advanced writing skill than Writing 1.