
The Citadel, The Military College of South Carolina College Portrait



The Citadel, The Military College of South Carolina Learning Outcomes

The Citadel identifies expected outcomes, assesses the extent to which it achieves these outcomes, and provides evidence of improvement based on analysis of the results in its educational programs. Assessment of student learning outcomes at The Citadel involves the systematic collection, evaluation, and use of data to improve teaching, institutional effectiveness, student learning, and student development. Educational programs at The Citadel employ a wide array of strategies for assessing student learning outcomes, including standardized tests, locally developed exams, writing assignments with rubrics, capstone experiences, portfolios, observation, interviews and focus groups, course evaluations, and survey instruments. These program assessment measures are supplemented by institutional assessments such as the Collegiate Learning Assessment (CLA), Multi-Institutional Study of Leadership (MSL), ETS Measure of Academic Progress (MAPP), National Survey of Student Engagement (NSSE), the National Collegiate Health Assessment, the Leadership Writing Assessment, and The Citadel Experience Survey.

 

The Citadel’s E-Leadership Portfolio is a required four-year, campus-wide initiative designed to document principled leadership and assess leadership knowledge, competency, and growth over time. The electronic portfolio also serves as a powerful mechanism for assessing general education learning outcomes, supports program-level assessment, and informs institutional continuous improvement efforts. The majority of the E-Leadership Portfolio artifacts use the Association of American Colleges and Universities (AAC&U) VALUE rubrics to assess written communication, critical thinking, ethical reasoning, and quantitative reasoning over time.





The Citadel, The Military College of South Carolina conducted a value-added administration of the CLA+ from 2012 to 2015. The results are displayed below in the SLO Results tab.

For additional information on The Citadel’s process for administering the CLA+, please click on the Assessment Process tab below. For information on the students included in the administration, please click the Students Tested tab.

Why did you choose the CLA+ for your institutional assessment?

The Collegiate Learning Assessment Plus (CLA+) is part of The Citadel’s comprehensive assessment program measuring student learning outcomes. The Citadel chose the CLA+ because it is a nationally normed and validated instrument that provides direct measures of students’ critical thinking and written communication skills, which the college has identified as key learning outcomes for students.


Which The Citadel, The Military College of South Carolina students are assessed? When?

For the CLA+, The Citadel assesses a random sample of approximately 100 first-year students during the fall semester and 100 seniors during the spring semester each academic year.


How are assessment data collected?

Data from students who take the CLA+ are collected by the Council for Aid to Education, which provides a comprehensive report to the college.


How are data reported within The Citadel, The Military College of South Carolina?

Aggregate CLA+ data are analyzed by the institution to identify trends, strengths, and opportunities for continuous improvement. Results are reviewed by college leaders and academic department heads, who share them with college faculty to develop goals and strategies for continuous improvement.


How are assessment data at The Citadel used to guide program improvements?

The CLA+ and other assessment instruments provide data regarding the college’s general education student learning outcomes. Faculty and college leaders with responsibility for the general education program review results annually in order to measure progress and identify areas for continuous improvement efforts. Continuous improvement plans are enacted at the beginning of each academic year and reviewed at year-end.


Of 1,865 freshmen eligible to be tested, 306 (16%) were included in the tested sample at The Citadel, The Military College of South Carolina.


Of 1,493 seniors eligible to be tested, 268 (18%) were included in the tested sample at The Citadel, The Military College of South Carolina.


Probability sampling, where a small randomly selected sample of a larger population can be used to estimate the learning gains in the entire population with statistical confidence, provides the foundation for campus-level student learning outcomes assessment at many institutions. It's important, however, to review the demographics of the tested sample of students to ensure that the proportion of students within a given group in the tested sample is close to the proportion of students in that group in the total population. Differences in proportions don't mean the results aren't valid, but they do mean that institutions need to use caution in interpreting the results for the groups that are under-represented in the tested sample.

Undergraduate Student Demographic Breakdown

                                        Freshmen                Seniors
                                   Eligible   Tested      Eligible   Tested
Gender
  Female                              7%        6%            7%       8%
  Male                               93%       94%           93%      86%
  Other or Unknown                   <1%       <1%            1%       6%
Race/Ethnicity
  US Underrepresented Minority       23%       15%           20%      14%
  White / Caucasian                  76%       83%           78%      83%
  International                       1%       <1%            1%      <1%
  Unknown                             1%        2%            1%       6%
Low-income (Eligible to receive
  a Federal Pell Grant)              23%       22%           <1%      <1%

For CLA+ assessment, the institution utilizes a randomized sample, so results are considered representative of the entire student population.

The VSA advises institutions to follow assessment publisher guidelines for determining the appropriate number of students to test. In the absence of publisher guidelines, the VSA provides sample size guidelines based on a 95% confidence interval and a 5% margin of error. So long as the tested sample's demographics represent the student body, this means we can be 95% certain that the "true" population learning outcomes are within +/- 5% of the reported results. For more information on sampling, please refer to the Research Methods Knowledge Base.
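As an illustration of where such guidelines come from (this sketch is not part of the VSA's or the publisher's documentation), the standard sample-size formula for estimating a proportion at a 95% confidence level and 5% margin of error, with a finite population correction, can be computed as follows. The eligible population counts are taken from the figures reported above; the conservative assumption p = 0.5 maximizes the required sample.

```python
import math

def required_sample_size(population: int, z: float = 1.96,
                         margin: float = 0.05, p: float = 0.5) -> int:
    """Minimum sample size to estimate a proportion at the given
    confidence level (z) and margin of error, applying the finite
    population correction for a known population size."""
    # Infinite-population sample size: n0 = z^2 * p(1-p) / e^2
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    # Finite population correction shrinks n0 toward the population size
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)

print(required_sample_size(1865))  # eligible freshmen -> 319
print(required_sample_size(1493))  # eligible seniors  -> 306
```

Under these assumptions the targets (319 freshmen, 306 seniors) are in the same range as the tested samples reported above (306 and 268), which is consistent with the caution that results for under-represented subgroups should be interpreted carefully.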

The increase in learning on the performance task is well below what would be expected at an institution testing students of similar academic abilities.

The increase in learning on the selected-response questions is well below what would be expected at an institution testing students of similar academic abilities.

Seniors Detail

The charts below show the proportion of tested seniors who scored at each level of the three subscales that make up the Performance Task section of the CLA+. The subscale scores range from 1 to 6, with 6 representing a better score. Due to rounding, subscores may not total 100%.

[Charts: seniors' score distributions on the Performance Task subscales: Analysis & Problem Solving, Writing Effectiveness, Writing Mechanics]

The table below shows students' mean scores on the three subscales that make up the Selected-Response Questions section of the CLA+. Students' subscores are determined by the number of correct responses in each subsection, with raw scores adjusted for the difficulty of the question set each student received. Individual student scores are rounded to the nearest whole number.

Subscale                                                Mean Student Score
Scientific & Quantitative Reasoning (Range: 200 to 800)      500.0
Critical Reading & Evaluation (Range: 200 to 800)            501.0
Critique an Argument (Range: 200 to 800)                     548.0
Freshmen Detail

The charts below show the proportion of tested freshmen who scored at each level of the three subscales that make up the Performance Task section of the CLA+. The subscale scores range from 1 to 6, with 6 representing a better score. Due to rounding, subscores may not total 100%.

[Charts: freshmen's score distributions on the Performance Task subscales: Analysis & Problem Solving, Writing Effectiveness, Writing Mechanics]

The table below shows students' mean scores on the three subscales that make up the Selected-Response Questions section of the CLA+. Students' subscores are determined by the number of correct responses in each subsection, with raw scores adjusted for the difficulty of the question set each student received. Individual student scores are rounded to the nearest whole number.

Subscale                                                Mean Student Score
Scientific & Quantitative Reasoning (Range: 200 to 800)      522.0
Critical Reading & Evaluation (Range: 200 to 800)            534.0
Critique an Argument (Range: 200 to 800)                     527.0