
Kansas State University College Portrait


Kansas State University Learning Outcomes

The Office of Assessment believes in a cooperative approach focused on student-centered learning.  Within a culture of trust and shared responsibility, faculty and student life professionals—with participation from students, administrators, alumni and K-State constituents—develop and implement ongoing and systematic assessment strategies to understand what, how much, and how students learn in order to continuously improve learning outcomes.  To assist faculty, the Office of Assessment provides support, resources, and training to help develop and implement assessment practices tailored to the faculty's needs.  

The Office of Assessment supports the K-State community's efforts to continuously improve student learning by:

  • Consulting with and assisting faculty and staff in planning, conducting, and interpreting assessment activities
  • Coordinating university-wide assessment activities
  • Serving as a resource on assessment issues
  • Working on a variety of special projects, as requested

For examples of assessment initiatives, see http://www.k-state.edu/assessment/initiatives/Assessment%20Projects%20at%20K-State.pdf




Kansas State University conducted a value-added administration of the CLA+ in 2014-2015. The results are displayed below in the SLO Results tab.

For additional information on K-State’s process for administering the CLA+, please click the Assessment Process tab below. For information on the students included in the administration, please click the Students Tested tab.

Why did you choose the CLA+ for your institutional assessment?

The CLA+, which is administered every three years, offers a value-added, constructed-response approach to the assessment of higher-order skills, such as critical thinking and written communication. Hundreds of institutions and hundreds of thousands of students have participated in the CLA to date. The institution - not the student - is the primary unit of analysis. The CLA is designed to measure an institution's contribution, or value added, to the development of higher-order skills. This approach allows an institution to compare its student learning results on the CLA with learning results at similarly selective institutions.


Which Kansas State University students are assessed? When?

Samples of approximately 100 first-year students and 100 seniors take the CLA+ every three years.  Samples are selected in a way that attempts to be as representative as possible of the entire university population.


How are assessment data collected?

First-year students take the CLA+ in the fall; seniors take the CLA+ in the spring.


How are data reported within Kansas State University?

The Council for Aid to Education (which administers the CLA) provides a report detailing results for both first-year and senior populations, and also compares results to other institutions that use the CLA.

CLA results are combined with other direct and indirect assessment indicators to create an overall picture of written communication and critical thinking. 


How are assessment data at K-State used to guide program improvements?

CLA results are compared to other direct and indirect measures of student achievement (surveys and program-level embedded assessment results). The overall picture of student achievement is then used to decide which areas to focus on for improvement at the institutional level. See the annual SLO Report for more details. Once the focus is decided, administration emphasizes these areas of improvement when providing feedback to programs on their assessment processes.


Of the 3,807 freshman students eligible to be tested, 92 (2%) were included in the tested sample at Kansas State University.


Of the 6,686 senior students eligible to be tested, 95 (1%) were included in the tested sample at Kansas State University.
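The whole-number percentages above are simple sampling fractions rounded to the nearest percent; a minimal sketch of the arithmetic, using the counts reported on this page:

```python
# Sampling fractions for the CLA+ tested samples, using the counts above.
eligible = {"freshmen": 3807, "seniors": 6686}
tested = {"freshmen": 92, "seniors": 95}

for group, n_eligible in eligible.items():
    pct = 100 * tested[group] / n_eligible
    print(f"{group}: {tested[group]} of {n_eligible} tested ({pct:.1f}%)")
# freshmen: 92 of 3807 tested (2.4%)
# seniors: 95 of 6686 tested (1.4%)
```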


Probability sampling, where a small randomly selected sample of a larger population can be used to estimate the learning gains in the entire population with statistical confidence, provides the foundation for campus-level student learning outcomes assessment at many institutions. It's important, however, to review the demographics of the tested sample of students to ensure that the proportion of students within a given group in the tested sample is close to the proportion of students in that group in the total population. Differences in proportions don't mean the results aren't valid, but they do mean that institutions need to use caution in interpreting the results for the groups that are under-represented in the tested sample.

Undergraduate Student Demographic Breakdown

                                                     Freshmen              Seniors
                                                Eligible  Tested      Eligible  Tested
Gender
  Female                                           52%      70%          48%      49%
  Male                                             48%      30%          52%      51%
  Other or Unknown                                 <1%      <1%          <1%      <1%
Race/Ethnicity
  US Underrepresented Minority                     15%      15%          14%       9%
  White / Caucasian                                79%      83%          77%      84%
  International                                     5%      <1%           1%      <1%
  Unknown                                           1%       2%           2%       6%
Low-income (eligible for a Federal Pell Grant)     22%      21%          24%      25%
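The representativeness caution above can be made concrete with a quick proportion check. A sketch using the freshman figures from the table; the 10-point flag threshold is illustrative, not a VSA rule, and "<1%" entries are treated as 0:

```python
# Flag groups whose share of the tested sample differs notably from their
# share of the eligible population. The threshold is illustrative only.
THRESHOLD = 10  # percentage points

# Freshman (eligible %, tested %) pairs from the table above; "<1%" -> 0.
freshmen = {
    "Female": (52, 70),
    "Male": (48, 30),
    "US Underrepresented Minority": (15, 15),
    "White / Caucasian": (79, 83),
    "International": (5, 0),
    "Low-income": (22, 21),
}

flagged = [group for group, (elig, test) in freshmen.items()
           if abs(test - elig) > THRESHOLD]
print(flagged)  # -> ['Female', 'Male']
```

This matches the caution in the text: freshman men were tested at 30% against a 48% eligible share, so their results warrant careful interpretation.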

The VSA advises institutions to follow assessment publisher guidelines for determining the appropriate number of students to test. In the absence of publisher guidelines, the VSA provides sample-size guidelines for institutions based on a 95% confidence interval and a 5% margin of error. So long as the tested sample's demographics represent the student body, this means we can be 95% certain that the "true" population learning outcomes are within +/- 5% of the reported results. For more information on sampling, please refer to the Research Methods Knowledge Base.
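The 95% confidence / 5% margin-of-error guideline corresponds to the standard sample-size formula for estimating a proportion. A sketch using the textbook formula with the usual finite-population correction; this is not necessarily the exact calculation the VSA or CAE uses:

```python
import math

def required_sample(N, z=1.96, p=0.5, e=0.05):
    """Sample size for estimating a proportion at margin of error e with
    confidence level given by z, assuming simple random sampling from a
    population of size N (textbook formula, worst case p = 0.5)."""
    n0 = (z ** 2) * p * (1 - p) / (e ** 2)      # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / N))   # finite-population correction

print(required_sample(3807))  # freshmen cohort -> 350
print(required_sample(6686))  # senior cohort   -> 364
```

The finite-population correction shrinks the required sample slightly for smaller cohorts; without it, both cohorts would need ceil(384.16) = 385 students.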

The increase in learning on the performance task is at or near what would be expected at an institution testing students of similar academic abilities.

The increase in learning on the selected-response questions is at or near what would be expected at an institution testing students of similar academic abilities.

Seniors Detail

The charts below show the proportion of tested seniors who scored at each level of the nine subscales that make up the CLA+. The subscale scores range from 1 to 6, with 6 being the highest. Due to rounding, subscores may not total 100%.

[Charts: score distributions on the Performance Task subscales: Analysis & Problem Solving, Writing Effectiveness, and Writing Mechanics]

The table below shows students' mean scores on the three subscales that make up the Selected-Response Questions section of the CLA+. Each student's subscore is determined by the number of correct responses in each subsection, with those raw numbers adjusted for the difficulty of the question set the student received. Individual student scores are rounded to the nearest whole number.

Subscale (Range: 200 to 800)             Mean Student Score
Scientific & Quantitative Reasoning            580.0
Critical Reading & Evaluation                  574.0
Critique an Argument                           593.0
Freshmen Detail

The charts below show the proportion of tested freshmen who scored at each level of the nine subscales that make up the CLA+. The subscale scores range from 1 to 6, with 6 being the highest. Due to rounding, subscores may not total 100%.

[Charts: score distributions on the Performance Task subscales: Analysis & Problem Solving, Writing Effectiveness, and Writing Mechanics]

The table below shows students' mean scores on the three subscales that make up the Selected-Response Questions section of the CLA+. Each student's subscore is determined by the number of correct responses in each subsection, with those raw numbers adjusted for the difficulty of the question set the student received. Individual student scores are rounded to the nearest whole number.

Subscale (Range: 200 to 800)             Mean Student Score
Scientific & Quantitative Reasoning            529.0
Critical Reading & Evaluation                  519.0
Critique an Argument                           510.0