Western Michigan University College Portrait

Western Michigan University Learning Outcomes

Assessment occurs on many levels at the university. Every department has an assessment plan that contains learning outcomes for its programs, and departments are expected to use assessment continually for improvement. The university has purchased TracDat from Nuventive to make reporting of assessment results uniform across the university. This software allows departments to record learning outcomes, the means of assessing those outcomes, the results of that assessment, actions taken in response to the results, and improvements that resulted from those actions. TracDat also allows departments to link program learning outcomes to departmental, college, and university strategic initiatives.

Departmental assessment is monitored by each college. Deans are required to write regular assessment reports describing the activity in each department within the college. These reports are reviewed by the University Assessment Steering Committee, which uses them to identify departments that may need assistance with assessment activities. The committee also rewards good assessment practice through annual awards to individuals and units.

The various areas of general education all have their own specific learning outcomes. To assess these outcomes, faculty are asked to use the VALUE rubrics developed by the Association of American Colleges and Universities or rubrics developed by faculty committees here at Western Michigan University. Results of these assessment activities are shared with the University Assessment Steering Committee and administrators across campus. The Collegiate Learning Assessment exam has been administered 5 times in the last 7 years to assess our students' ability to write and think critically. Additional assessment includes the National Survey of Student Engagement (NSSE) and the Faculty Survey of Student Engagement (FSSE). The decision has recently been made to administer the NSSE/FSSE and the CLA in alternate years.

Recently, the University Assessment Steering Committee has committed to requiring assessment plans for student support units such as University Libraries, Student Affairs, and advising offices. These plans are now being submitted to and approved by the University Assessment Steering Committee. Regular reports, like those described above for the academic departments, are expected to be written by these units and reviewed by the committee to determine where help may be needed.

For other assessment information and initiatives, please go to:

http://www.wmich.edu/assessment/

For examples of student success, please go to:

http://scholarworks.wmich.edu/provost_prism

Western Michigan University conducted a value-added administration of the CLA+ in Fall 2013 - Spring 2014. The results are displayed below in the SLO Results tab.

For additional information on WMU's process for administering the CLA+, please click on the Assessment Process tab below. For information on the students included in the administration, please click the Students Tested tab.

Why did you choose the CLA+ for your institutional assessment?

A group of faculty examined the three exams recommended by the VSA for assessment reporting. The committee felt that writing could not be assessed effectively using multiple-choice questions (the format of two of the options) and therefore chose the Collegiate Learning Assessment exam, which assesses actual writing samples.


Which Western Michigan University students are assessed? When?

Incoming freshmen are given the CLA+ in the fall of their first year. Graduating seniors are given the exam in their last semester prior to graduation. Only graduating seniors who transferred in 20 or fewer credit hours are eligible to take the exam; this ensures that the learning being measured occurred at Western Michigan University and not at another institution.


How are assessment data collected?

The university uses the data provided by the CLA+ publisher after its analysis. These data are then distributed to the Faculty Senate, the University Assessment Steering Committee, and various administrators across campus.


How are data reported within Western Michigan University?

Until the most recent administration of the CLA+, the university did not analyze the data beyond the analysis provided with the CLA+ results. That information, however, was not very helpful in making changes for improvement. In the most recent administration of the exam, we began to use the optional questions. For example, we asked students whether they were in a major that requires a capstone experience, since not all of our programs have such a requirement. Many of the questions were related to questions on the NSSE so that we could begin to link the results of those two assessment tools.

How are assessment data at WMU used to guide program improvements?

The plan for the future is to use the results from the optional questions to guide improvements. For example, if we find that students who are in programs that require a capstone experience perform better in critical thinking, we will need to consider requiring a capstone experience in every program.


Of 3,128 freshman students eligible to be tested, 82 (3%) were included in the tested sample at Western Michigan University.


Of 1,366 senior students eligible to be tested, 69 (5%) were included in the tested sample at Western Michigan University.


Probability sampling, where a small randomly selected sample of a larger population can be used to estimate the learning gains in the entire population with statistical confidence, provides the foundation for campus-level student learning outcomes assessment at many institutions. It's important, however, to review the demographics of the tested sample of students to ensure that the proportion of students within a given group in the tested sample is close to the proportion of students in that group in the total population. Differences in proportions don't mean the results aren't valid, but they do mean that institutions need to use caution in interpreting the results for the groups that are under-represented in the tested sample.

Undergraduate Student Demographic Breakdown

                                                   Freshmen              Seniors
                                              Eligible   Tested     Eligible   Tested
Gender
  Female                                         52%       63%         54%       58%
  Male                                           48%       37%         46%       42%
  Other or Unknown                               <1%       <1%         <1%       <1%
Race/Ethnicity
  US Underrepresented Minority                   26%       22%         17%       19%
  White / Caucasian                              73%       74%         79%       75%
  International                                  <1%       <1%          3%       <1%
  Unknown                                         1%        4%          2%        6%
Low-income (eligible to receive a
  Federal Pell Grant)                            35%       40%         27%       38%

The first year we administered the exam, we examined the relationship between the students who took the exam and the cohort from which they came. Our results showed that the tested sample was fairly representative of the cohort on measures such as the percentage of minority students and ACT scores. One difference was that a higher percentage of women took the exam. We still need to conduct this analysis for the last three administrations of the exam.
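As an illustration of this kind of representativeness check, the sketch below compares the share of women among eligible and tested freshmen using a two-proportion z-test and the counts reported on this page. This is a minimal Python illustration, not part of WMU's actual analysis, and the function name is our own.

```python
from math import sqrt

def two_proportion_z(p1: float, n1: int, p2: float, n2: int) -> float:
    """z statistic for H0: the two group proportions are equal (pooled SE)."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)              # proportion under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))  # pooled standard error
    return (p1 - p2) / se

# Share of women: 63% of the 82 tested freshmen vs. 52% of the 3,128 eligible.
z = two_proportion_z(0.63, 82, 0.52, 3128)
print(f"z = {z:.2f}")  # |z| > 1.96 would indicate a difference at the 5% level
```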

The VSA advises institutions to follow assessment publisher guidelines for determining the appropriate number of students to test. In the absence of publisher guidelines, the VSA provides sample size guidelines based on a 95% confidence interval and a 5% margin of error. As long as the tested sample's demographics represent the student body, this means we can be 95% confident that the "true" population learning outcomes are within +/- 5% of the reported results. For more information on sampling, please refer to the Research Methods Knowledge Base.
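For concreteness, guidelines like these typically follow the standard sample-size formula for estimating a proportion, sketched below with a finite population correction. The population counts plugged in are the eligible-student numbers from this page; the assumption p = 0.5 (maximum variance) is the usual conservative default, and this is our illustration rather than the VSA's own calculation.

```python
from math import ceil

def required_sample_size(population: int, z: float = 1.96,
                         moe: float = 0.05, p: float = 0.5) -> int:
    """Sample size for a 95% CI (+/- moe) on a proportion,
    with a finite population correction applied."""
    n0 = (z ** 2) * p * (1 - p) / moe ** 2         # infinite-population size
    return ceil(n0 / (1 + (n0 - 1) / population))  # finite population correction

print(required_sample_size(3128))  # eligible WMU freshmen
print(required_sample_size(1366))  # eligible WMU seniors
```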

The increase in learning on the performance task is at or near what would be expected at an institution testing students of similar academic abilities.

The increase in learning on the selected-response questions is at or near what would be expected at an institution testing students of similar academic abilities.

Seniors Detail

The charts below show the proportion of tested seniors who scored at each level of the three subscales that make up the CLA+ Performance Task. The subscale scores range from 1 to 6, with 6 representing the highest score. Due to rounding, percentages may not total 100%.

[Charts: Performance Task subscale score distributions (levels 1 to 6) for Analysis & Problem Solving, Writing Effectiveness, and Writing Mechanics]

The table below shows students' mean scores on the three subscales that make up the Selected-Response Questions section of the CLA+. Students' subscores are determined by the number of correct responses in each subsection, with those raw counts adjusted for the difficulty of the question set each student received. Individual student scores are rounded to the nearest whole number.

Subscale (Range: 200 to 800)             Mean Student Score
Scientific & Quantitative Reasoning            566.0
Critical Reading & Evaluation                  548.0
Critique an Argument                           566.0

Freshmen Detail

The charts below show the proportion of tested freshmen who scored at each level of the three subscales that make up the CLA+ Performance Task. The subscale scores range from 1 to 6, with 6 representing the highest score. Due to rounding, percentages may not total 100%.

[Charts: Performance Task subscale score distributions (levels 1 to 6) for Analysis & Problem Solving, Writing Effectiveness, and Writing Mechanics]

The table below shows students' mean scores on the three subscales that make up the Selected-Response Questions section of the CLA+. Students' subscores are determined by the number of correct responses in each subsection, with those raw counts adjusted for the difficulty of the question set each student received. Individual student scores are rounded to the nearest whole number.

Subscale (Range: 200 to 800)             Mean Student Score
Scientific & Quantitative Reasoning            513.0
Critical Reading & Evaluation                  494.0
Critique an Argument                           502.0