California Maritime Academy College Portrait

California Maritime Academy Learning Outcomes

Assessment of student learning occurs at both the institutional and program levels. The Institution-Wide Assessment Committee uses a four-year cycle to assess ten student learning outcomes consistent with the institutional mission, including critical and creative thinking, leadership and teamwork, lifelong learning, ethical awareness, and global stewardship. Additionally, degree-granting programs conduct discipline-specific assessment to measure expertise in the concepts and technologies of a chosen field. Finally, Cal Maritime participates in broader measures of student learning, including the Collegiate Learning Assessment (CLA+).




California Maritime Academy conducted a value-added administration of the CLA+ in 2014. The results are displayed below in the SLO Results tab.

For additional information on CMA’s process for administering CLA+, please click on the Assessment Process Tab below. For information on the students included in the administration, please click the Students Tested Tab.

Why did you choose the CLA+ for your institutional assessment?

The CLA+ was selected by the California State University System as a system-wide value-added assessment of students' higher-order skills, including analysis and problem-solving, writing effectiveness, and writing mechanics.


Which California Maritime Academy students are assessed? When?

Each fall semester a random sample of entering freshmen is tested. Each spring semester a random sample of graduating seniors is tested.


How are assessment data collected?

The CLA+ publisher compiles the data from each testing period into an annual report for both campus use and system-wide comparison.


How are data reported within California Maritime Academy?

The data are reviewed twice a year by the Accreditation Liaison Officer and the Director of the Center for Student Engagement and Academic Support, and are then disseminated to the Institution-Wide Assessment Committee.


How are assessment data at CMA used to guide program improvements?

The assessment data are used to help inform changes to the campus Graduation Writing Assessment Requirement (GWAR) as well as our senior-level advanced writing courses and our lower-level critical thinking courses.


Of 175 freshmen students eligible to be tested, 24 (14%) were included in the tested sample at California Maritime Academy.


Of 487 senior students eligible to be tested, 8 (2%) were included in the tested sample at California Maritime Academy.


Probability sampling, where a small randomly selected sample of a larger population can be used to estimate the learning gains in the entire population with statistical confidence, provides the foundation for campus-level student learning outcomes assessment at many institutions. It's important, however, to review the demographics of the tested sample of students to ensure that the proportion of students within a given group in the tested sample is close to the proportion of students in that group in the total population. Differences in proportions don't mean the results aren't valid, but they do mean that institutions need to use caution in interpreting the results for the groups that are under-represented in the tested sample.

Undergraduate Student Demographic Breakdown

                                            Freshmen                Seniors
                                       Eligible   Tested      Eligible   Tested
Gender
  Female                                 11%       13%          16%       38%
  Male                                   89%       88%          84%       63%
  Other or Unknown                       <1%       <1%          <1%       <1%
Race/Ethnicity
  US Underrepresented Minority           30%       21%          24%       63%
  White / Caucasian                      61%       58%          61%       38%
  International                          <1%       <1%          <1%       <1%
  Unknown                                 3%        4%           6%       <1%
Low-income (eligible to receive a
  Federal Pell Grant)                    35%       <1%          35%       <1%

Our tested sample reflects our student body, which is predominantly male, and mirrors our diversity ratio on campus. Because our student population is so small (about 1,050), our sample size is also small. The voluntary nature of the exam makes it difficult to obtain a true cross-section of the student population.
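One way to apply the caution described above (comparing each group's share of the tested sample against its share of the eligible population) is to flag groups whose shares differ by more than some threshold. The sketch below is illustrative only: the 10-percentage-point threshold is a hypothetical choice, and the figures are the senior percentages from the demographic table above.

```python
# Illustrative sketch: flag demographic groups whose share of the tested
# sample differs from their share of the eligible population by more than
# a (hypothetical) threshold of 10 percentage points.
# Senior percentages taken from the demographic table above:
# group -> (eligible %, tested %)
seniors = {
    "Female": (16, 38),
    "Male": (84, 63),
    "US Underrepresented Minority": (24, 63),
    "White / Caucasian": (61, 38),
}

def flag_gaps(groups, threshold=10):
    """Return (sorted) groups where |tested - eligible| exceeds threshold."""
    return sorted(
        name
        for name, (eligible, tested) in groups.items()
        if abs(tested - eligible) > threshold
    )

print(flag_gaps(seniors))
```

Under this threshold all four senior groups are flagged, which is consistent with the caution above about interpreting results from a very small senior sample (8 students).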

The VSA advises institutions to follow assessment publisher guidelines for determining the appropriate number of students to test. In the absence of publisher guidelines, the VSA provides sample size guidelines for institutions based on a 95% confidence interval and a 5% margin of error. So long as the tested sample demographics represent the student body, this means we can be 95% certain that the "true" population learning outcomes are within +/- 5% of the reported results. For more information on sampling, please refer to the Research Methods Knowledge Base.
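The sample size implied by a 95% confidence interval and 5% margin of error can be sketched with the standard formula for estimating a proportion (normal approximation with a finite-population correction). This is an illustrative calculation under those textbook assumptions, not the VSA's published guideline table.

```python
import math

def required_sample_size(population, z=1.96, margin=0.05, p=0.5):
    """Sample size needed to estimate a proportion within +/- margin
    at the confidence level implied by z, with a finite-population
    correction. p = 0.5 is the most conservative assumption."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2   # infinite-population size (~384)
    n = n0 / (1 + (n0 - 1) / population)        # finite-population correction
    return math.ceil(n)

# For a student body of roughly 1,050 (the figure cited above):
print(required_sample_size(1050))  # -> 282
```

Note that the tested samples reported above (24 freshmen, 8 seniors) fall well below what this formula would suggest, which underscores the caution about interpreting the results.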

The increase in learning on the performance task is at or near what would be expected at an institution testing students of similar academic abilities.

The increase in learning on the selected-response questions is at or near what would be expected at an institution testing students of similar academic abilities.

Seniors Detail

The charts below show the proportion of tested seniors who scored at each level on the three subscales that make up the Performance Task section of the CLA+. The subscale scores range from 1 to 6, with 6 representing the highest score. Due to rounding, subscores may not total 100%.

[Charts: Performance Task subscale score distributions: Analysis & Problem Solving, Writing Effectiveness, Writing Mechanics]

The table below shows students' mean scores on the three subscales that make up the Selected-Response Questions section of the CLA+. The students' subscores are determined by the number of correct responses in each subsection, with those raw numbers adjusted based on the difficulty of the question set the students received. Individual student scores are rounded to the nearest whole number.

Subscale (Range: 200 to 800)             Mean Student Score
Scientific & Quantitative Reasoning             708.0
Critical Reading & Evaluation                   640.0
Critique an Argument                            808.0
Freshmen Detail

The charts below show the proportion of tested freshmen who scored at each level on the three subscales that make up the Performance Task section of the CLA+. The subscale scores range from 1 to 6, with 6 representing the highest score. Due to rounding, subscores may not total 100%.

[Charts: Performance Task subscale score distributions: Analysis & Problem Solving, Writing Effectiveness, Writing Mechanics]

The table below shows students' mean scores on the three subscales that make up the Selected-Response Questions section of the CLA+. The students' subscores are determined by the number of correct responses in each subsection, with those raw numbers adjusted based on the difficulty of the question set the students received. Individual student scores are rounded to the nearest whole number.

Subscale (Range: 200 to 800)             Mean Student Score
Scientific & Quantitative Reasoning             498.0
Critical Reading & Evaluation                   530.0
Critique an Argument                            543.0