New Mexico Highlands University College Portrait


New Mexico Highlands University Learning Outcomes

New Mexico Highlands University conducts yearly assessments of our academic programs, which are then linked to student learning outcomes at the program level and university-wide. We administered the Collegiate Learning Assessment in 2008 and again in 2014-2015.




New Mexico Highlands University administered the CLA+ in 2014-2015.

New Mexico Highlands University conducted a Value-added administration of the CLA+ in 2014-2015. The results are displayed below in the SLO Results tab.

For additional information on NMHU’s process for administering the CLA+, please click on the Assessment Process tab below. For information on the students included in the administration, please click the Students Tested tab.

Why did you choose the CLA+ for your institutional assessment?

 

The Collegiate Learning Assessment (CLA) is administered to freshmen students in the fall semester and graduating senior students in the spring semester. The test is designed to assess students’ abilities to “think critically, reason analytically, solve problems, and communicate clearly and effectively.” It is a written exam (not multiple choice).


Which New Mexico Highlands University students are assessed? When?

 

The CLA was administered for the second time at Highlands in the 2014-2015 academic year. Freshmen were assessed in the fall and seniors were assessed in the spring.


How are assessment data collected?

Freshmen and seniors were randomly chosen to participate in the CLA.


How are data reported within New Mexico Highlands University?

The tests were scored by CLA, and Highlands was provided with the report.


How are assessment data at NMHU used to guide program improvements?

CLA results are considered along with other forms of student learning assessment in designing and implementing program improvements. 


Of 275 freshmen students eligible to be tested, 68 (25%) were included in the tested sample at New Mexico Highlands University.


Of 357 senior students eligible to be tested, 32 (9%) were included in the tested sample at New Mexico Highlands University.


Probability sampling, where a small randomly selected sample of a larger population can be used to estimate the learning gains in the entire population with statistical confidence, provides the foundation for campus-level student learning outcomes assessment at many institutions. It's important, however, to review the demographics of the tested sample of students to ensure that the proportion of students within a given group in the tested sample is close to the proportion of students in that group in the total population. Differences in proportions don't mean the results aren't valid, but they do mean that institutions need to use caution in interpreting the results for the groups that are under-represented in the tested sample.

Undergraduate Student Demographic Breakdown

                                                   Freshmen                            Seniors
                                        Eligible Students  Tested Students  Eligible Students  Tested Students
Gender
  Female                                       49%               49%               69%               72%
  Male                                         51%               51%               31%               28%
  Other or Unknown                             <1%               <1%               <1%               <1%
Race/Ethnicity
  US Underrepresented Minority                 83%               79%               68%               59%
  White / Caucasian                            11%               16%               23%               16%
  International                                 5%                3%                6%                9%
  Unknown                                       1%                1%                2%               16%
Low-income
  (Eligible to receive a Federal Pell Grant)   58%               47%               75%               56%
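To make the representativeness check above concrete, the sketch below (Python) compares eligible and tested proportions for several freshmen groups from the table. The percentages come from the table, but the variable names, the 10-percentage-point threshold, and the flag wording are illustrative assumptions, not part of the VSA or CLA+ methodology.

```python
# Illustrative representativeness check: compare the share of each group
# among eligible students with its share among tested students.

freshmen = {
    # group: (eligible %, tested %)  -- values from the table above
    "Female": (49, 49),
    "Male": (51, 51),
    "US Underrepresented Minority": (83, 79),
    "White / Caucasian": (11, 16),
    "International": (5, 3),
    "Low-income (Pell-eligible)": (58, 47),
}

THRESHOLD = 10  # percentage points; an arbitrary flag level for this sketch

for group, (eligible, tested) in freshmen.items():
    gap = tested - eligible
    note = "interpret with caution" if abs(gap) >= THRESHOLD else "roughly representative"
    print(f"{group}: eligible {eligible}%, tested {tested}%, gap {gap:+d} pts -> {note}")
```

In this illustration the low-income group (47% tested versus 58% eligible) would be flagged, consistent with the caution above about groups that are under-represented in the tested sample.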

The VSA advises institutions to follow assessment publisher guidelines for determining the appropriate number of students to test. In the absence of publisher guidelines, the VSA provides sample size guidelines for institutions based on a 95% confidence interval and 5% margin of error. So long as the tested sample demographics represent the student body, this means we can be 95% certain that the "true" population learning outcomes are within +/- 5% of the reported results. For more information on sampling, please refer to the Research Methods Knowledge Base.
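As a rough illustration of what a 95% confidence level and 5% margin of error imply, the sketch below uses the standard sample-size formula for estimating a proportion, with a finite population correction. It is shown only for orientation; it is not the VSA's or the test publisher's actual guideline.

```python
import math

def required_sample_size(population: int, margin: float = 0.05,
                         z: float = 1.96, p: float = 0.5) -> int:
    """Sample size needed to estimate a proportion at the given margin of error.

    n0 = z^2 * p * (1 - p) / margin^2, followed by a finite population correction.
    z = 1.96 corresponds to 95% confidence; p = 0.5 is the most conservative choice.
    """
    n0 = (z ** 2) * p * (1 - p) / (margin ** 2)        # infinite-population requirement
    return math.ceil(n0 / (1 + (n0 - 1) / population))  # finite population correction

# Using the eligible-student counts reported above:
print(required_sample_size(275))  # freshmen: prints 161
print(required_sample_size(357))  # seniors: prints 186
```

Publisher guidelines, where they exist, take precedence over this generic calculation.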

The increase in learning on the performance task is at or near what would be expected at an institution testing students of similar academic abilities.

The increase in learning on the selected-response questions is at or near what would be expected at an institution testing students of similar academic abilities.

Seniors Detail

The charts below show the proportion of tested seniors who scored at each level of the nine subscales that make up the CLA+. The subscale scores range from 1 to 6, with 6 representing the highest score. Due to rounding, subscores may not total 100%.

[Charts: distribution of senior scores on the Performance Task subscales of Analysis & Problem Solving, Writing Effectiveness, and Writing Mechanics]

The table below shows students' mean scores on the three subscales that make up the Selected-Response Questions section of the CLA+. Students' subscores are determined by the number of correct responses in each subsection, with those raw counts adjusted for the difficulty of the question set each student received; individual student scores are rounded to the nearest whole number. (A purely illustrative sketch of this kind of adjustment follows the table.)

Subscale                                                   Mean Student Score
Scientific & Quantitative Reasoning (Range: 200 to 800)          455.0
Critical Reading & Evaluation (Range: 200 to 800)                438.0
Critique an Argument (Range: 200 to 800)                         465.0

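The difficulty adjustment itself is performed by the test publisher and is not documented here. Purely as a hypothetical illustration of the general idea, adjusting a number-correct score for question-set difficulty, mapping it onto the 200 to 800 reporting range, and rounding to a whole number, consider the sketch below; the function name, the difficulty weight, and the linear mapping are all assumptions.

```python
# Hypothetical illustration only; the actual CLA+ scaling is proprietary.

def scaled_subscore(correct: int, total: int, difficulty: float,
                    low: int = 200, high: int = 800) -> int:
    """Map a difficulty-adjusted number-correct score onto a 200-800 scale."""
    # difficulty > 1.0 stands in for a harder question set, < 1.0 for an easier one
    adjusted = min(1.0, (correct / total) * difficulty)
    return round(low + adjusted * (high - low))  # rounded to the nearest whole number

# Example: 10 of 25 correct on a slightly harder-than-average question set.
print(scaled_subscore(10, 25, difficulty=1.05))  # prints 452
```

The reported mean subscale scores above come from the publisher's own scoring; this sketch only illustrates why individual scores land on whole numbers within the 200 to 800 range.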
Freshmen Detail

The charts below show the proportion of tested freshmen who scored at each level of the nine subscales that make up the CLA+. The subscale scores range from 1 to 6, with 6 representing the highest score. Due to rounding, subscores may not total 100%.

[Charts: distribution of freshman scores on the Performance Task subscales of Analysis & Problem Solving, Writing Effectiveness, and Writing Mechanics]

The table below shows students' mean scores on the three subscales that make up the Selected-Response Questions section of the CLA+. Students' subscores are determined by the number of correct responses in each subsection, with those raw counts adjusted for the difficulty of the question set each student received; individual student scores are rounded to the nearest whole number.

Subscale                                                   Mean Student Score
Scientific & Quantitative Reasoning (Range: 200 to 800)          454.0
Critical Reading & Evaluation (Range: 200 to 800)                438.0
Critique an Argument (Range: 200 to 800)                         465.0