
The University of Montana College Portrait


The University of Montana Learning Outcomes

Academic assessment is an ongoing process of the highest priority at the University of Montana. Our accrediting agency, the Northwest Commission on Colleges and Universities, requires that we focus our attention on the quality and effectiveness of our programs. According to Dr. Tom Angelo (1995), assessment is "an ongoing process aimed at understanding and improving student learning. It involves: 1) making expectations explicit and public; setting appropriate criteria and high expectations for learning quality; 2) systematically gathering, analyzing, and interpreting evidence to determine how well performance matches those expectations and standards; and 3) using the resulting information to document, explain, and improve performance."

Simply put, assessment (often called outcomes assessment) is the process by which we evaluate the curriculum, plan improvements when necessary, and evaluate the effects of those changes. Assessment helps departments affirm the things in their curriculum and courses that are going well. It also helps identify areas for improvement, and often points to the specific changes that might be needed. Assessment is not static; rather, it is an ongoing and continuous effort to improve the quality of instruction, student learning, and the overall effectiveness of a department or unit.

At the University of Montana, the steps for developing an assessment plan include:

• Reviewing, revising, and/or writing your unit's mission statement;
• Developing goals and objectives for your program;
• Identifying the educational experiences or activities for attaining goals and objectives;
• Identifying measures to assess progress toward meeting your goals;
• Developing a plan for gathering the data;
• Collecting, analyzing, and interpreting the data;
• Using the data to continuously revise and improve students' educational experiences and activities;
• Communicating the results.

Departments are required to report on their assessment efforts biennially.
The biennial report includes the department's assessment plan and communicates the assessment results from the previous year. Once submitted, the reports are reviewed by Dr. Nathan Lindsay, Associate Provost for Dynamic Learning, and then published to the Department Reports website. The Assessment Advisory Committee then reviews each report and provides timely feedback to help departments improve their assessment efforts in the future, assessing the reports with a four-point rubric in the areas of mission, goals, indicators, and modifications and planning.

In addition to departmental assessments, one indirect assessment conducted at the University of Montana is the National Survey of Student Engagement (NSSE). In February 2013, first-year and senior students were invited to participate in NSSE, and more than 1,000 students completed the survey. A few of the findings are outlined below:

• One-page summary of UM NSSE Results
• Differences in Engagement between first-year and senior students
• NSSE Results: Student Peer Interactions and Diversity on Campus




The University of Montana conducted a value-added administration of the CLA in 2011-2012. The results are displayed below in the SLO Results tab.

For additional information on UM's process for administering the CLA, please click on the Assessment Process tab below. For information on the students included in the administration, please click the Students Tested tab.

Why did you choose the CLA for your institutional assessment?

The Collegiate Learning Assessment (CLA) was chosen as a means to assess whether students were attaining strong analytical, quantitative, information and communication skills. The CLA is one of the primary ways that we assess these General Education learning outcomes at our institution. The University of Montana first conducted the CLA, a product of the Council for Aid to Education (CAE), in 2006-2007.


Which University of Montana students are assessed? When?

Data for the Collegiate Learning Assessment were collected from first-year students in Fall 2011, and from graduating seniors in Spring 2012.


How are assessment data collected?

Students were recruited through flyers posted around campus and were offered incentives for taking the assessment, such as partial credit in their courses or a cash payment. Students took the essay exam in a campus computer lab, where they could be proctored.


How are data reported within The University of Montana?

The data were aggregated by class level, and first-year student CLA scores were compared with those of seniors. Prior CLA reports placed the University of Montana in the top 20% of institutions for value-added gains from the first year to the senior year. The most recent report, however, indicated less improvement over the four-year period.


How are assessment data at UM used to guide program improvements?

Campus discussion regarding results has centered on students' writing and critical thinking abilities, areas emphasized by the CLA. The findings have provided additional context for the writing assessment conducted on campus, as outlined below. In fall 2011, the University's Writing Committee assessed the process by which faculty request that courses be designated for writing credit and quantified how many courses initially fail to meet the criteria and why. The committee also developed a rubric for assessing students' writing at mid-career. In spring 2012 and 2013, faculty members from across campus held retreats to test and evaluate the efficacy of the rubric and assessment process.

In tandem, seven academic departments developed rubrics for discipline-specific assessment of writing. The new assessment process was initiated in December 2013. The Writing Committee held a Writing Retreat in April 2014 in which faculty used the new rubric to assess a representative sample of student papers drawn from all mid-level writing courses offered in spring 2014. The results have helped us identify areas of strength and weakness in students' writing that can be relayed to faculty teaching writing courses as well as department chairs.


Of 2,733 freshman students eligible to be tested, 82 (3%) were included in the tested sample at The University of Montana.


Of 2,948 senior students eligible to be tested, 87 (3%) were included in the tested sample at The University of Montana.
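As a rough illustration of what samples of this size can support, the standard normal-approximation margin of error for estimating a proportion, with a finite-population correction, can be sketched as below. This is a generic textbook formula assuming simple random sampling, not CAE's or the VSA's published methodology:

```python
import math

def margin_of_error(n, population, z=1.96, p=0.5):
    """Approximate margin of error for estimating a proportion from a
    simple random sample of size n drawn from the given population.
    z=1.96 corresponds to 95% confidence; p=0.5 is the most conservative
    assumption about the true proportion."""
    se = math.sqrt(p * (1 - p) / n)                       # standard error of the proportion
    fpc = math.sqrt((population - n) / (population - 1))  # finite-population correction
    return z * se * fpc

# Tested samples reported above:
print(round(margin_of_error(82, 2733), 3))   # freshmen
print(round(margin_of_error(87, 2948), 3))   # seniors
```

Under these assumptions, samples of 82 and 87 correspond to margins of error of roughly ten percentage points for a single proportion, which is one reason the caution about interpreting sub-group results below matters.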


Probability sampling, where a small randomly selected sample of a larger population can be used to estimate the learning gains in the entire population with statistical confidence, provides the foundation for campus-level student learning outcomes assessment at many institutions. It's important, however, to review the demographics of the tested sample of students to ensure that the proportion of students within a given group in the tested sample is close to the proportion of students in that group in the total population. Differences in proportions don't mean the results aren't valid, but they do mean that institutions need to use caution in interpreting the results for the groups that are under-represented in the tested sample.

Undergraduate Student Demographic Breakdown

                                      Freshmen               Seniors
                                 Eligible   Tested      Eligible   Tested
Gender
  Female                            50%       74%          54%       59%
  Male                              50%       26%          46%       41%
  Other or Unknown                  <1%       <1%          <1%       <1%
Race/Ethnicity
  US Underrepresented Minority      12%        9%          12%       10%
  White / Caucasian                 71%       88%          81%       75%
  International                      1%       <1%           2%        2%
  Unknown                           15%        4%           5%       13%
Low-income (eligible to receive
a Federal Pell Grant)               55%       49%          50%       46%

Students were recruited to participate through flyers posted throughout campus, as well as invitations extended in specific courses. Thus, the students in the study constituted a convenience sample rather than a random sample. Specific sub-populations of students were not targeted, so the sample may not have been fully representative of the gender and racial/ethnic diversity at the University of Montana.

The VSA advises institutions to follow assessment publisher guidelines for determining the appropriate number of students to test. In the absence of publisher guidelines, the VSA provides sample size guidelines for institutions based on a 95% confidence interval and a 5% margin of error. So long as the tested sample demographics represent the student body, this means we can be 95% certain that the "true" population learning outcomes are within +/- 5% of the reported results. For more information on sampling, please refer to the Research Methods Knowledge Base.
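A 95% confidence interval with a 5% margin of error corresponds to the standard Cochran sample-size formula with a finite-population correction. A minimal sketch, assuming simple random sampling and the most conservative proportion p = 0.5 (the exact VSA guideline tables are not reproduced here):

```python
import math

def required_sample_size(population, margin=0.05, z=1.96, p=0.5):
    """Sample size needed to estimate a proportion within +/- margin at the
    confidence level implied by z (1.96 -> 95%), assuming a simple random
    sample, with a finite-population correction."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2          # infinite-population size (~384)
    return math.ceil(n0 / (1 + (n0 - 1) / population))

print(required_sample_size(2733))  # eligible freshmen
print(required_sample_size(2948))  # eligible seniors
```

For populations of a few thousand students, this works out to roughly 330-340 tested students per class level; the correction matters little once the population is large, where the requirement plateaus near 385.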

The increase in learning on the performance task is at or near what would be expected at an institution testing students of similar academic abilities.

The increase in learning on the analytic writing task is at or near what would be expected at an institution testing students of similar academic abilities.

Seniors Detail

The charts below show the proportion of tested seniors who scored at each level of the nine subscales that make up the CLA. The subscale scores range from 1 to 6, with 6 representing the best performance. Due to rounding, subscore percentages may not total 100%.


[Charts not shown: senior subscore distributions for the Performance Task, Make-an-Argument, and Critique-an-Argument tasks on the Analytic Reasoning and Evaluation, Writing Effectiveness, Writing Mechanics, and Problem Solving subscales.]

Freshmen Detail

The charts below show the proportion of tested freshmen who scored at each level of the nine subscales that make up the CLA. The subscale scores range from 1 to 6, with 6 representing the best performance. Due to rounding, subscore percentages may not total 100%.


[Charts not shown: freshman subscore distributions for the Performance Task, Make-an-Argument, and Critique-an-Argument tasks on the Analytic Reasoning and Evaluation, Writing Effectiveness, Writing Mechanics, and Problem Solving subscales.]