CUNY John Jay College of Criminal Justice College Portrait

CUNY John Jay College of Criminal Justice Learning Outcomes

Assessment of student learning has become a ubiquitous and central activity at John Jay College. Assessment programs focused on student learning outcomes are in place and active for all undergraduate majors and graduate programs, as well as for several college-wide programs (e.g., First-Year Experience Learning Communities). Each department develops, administers, updates, and expands its own assessment program. Most departments have standing assessment committees of elected faculty members, and two-thirds of these committees have five or more members. In some of the smaller departments, the entire faculty serves as the “assessment committee” that works on student learning assessment. Virtually all of the departmental assessment committees were established in 2009 or later. Although progress and action began quite recently, the College faculty has taken up assessment quickly, productively, and broadly, with many faculty members involved.

Students’ learning is grounded in General Education degree requirements, a curriculum that has recently been reframed to focus squarely on student learning outcomes and strengthened to address changes in the student population, the assessment of learning, and new University-wide core requirements. Learning outcomes in B.A. programs emphasize discipline-specific research skills, while those in B.S. programs highlight practical applications. All departments, of course, give primary focus to student acquisition of a knowledge base in the major discipline. Demonstrable appreciation of ethics is included as a learning objective in most departments’ assessment plans and philosophies. Importantly, 20 of the 24 departments report that they aim to assess their success at educating “the whole student,” including writing and communication and critical thinking/analytical skills among the student learning outcomes of interest. Moreover, 19 of the 24 mention learning outcomes on skills and knowledge dimensions relevant to entry into graduate/professional school, major-related careers, or both. In graduate programs, expectations for learning are more demanding in order to prepare Master’s students to become practitioners or independent researchers in their chosen fields.

Direct assessments of learning outcomes are being conducted in specific major courses and a senior-level capstone, using a rubric that matches specific courses to specific learning objectives or sub-objectives emphasized in those courses. Assessment planning follows a multi-year assessment cycle. Most assessments of programs have addressed at least 4 of the 5 key learning domains described previously, i.e., knowledge of theories and concepts, applications, research skills, critical thinking and reasoning, and communication. Findings from evaluating outcomes in the capstone experience courses provided the context for follow-up studies in 200- and 300-level courses.

Indirect assessments of how students fare on identified learning outcomes are also included in the assessment plans of more than three-fourths of the departments. These indirect assessments include Office of Institutional Research surveys, department-created surveys, and students’ grades.

Faculty involvement in assessment is widespread, and administrative offices created or configured for the outcomes assessment mission increasingly provide assistance and coordination. Results of assessments generally suggest that learning objectives are being met, though not always to a degree the faculty considers satisfactory. Appropriately, departments are already using assessment data to guide changes and improvements in both the curriculum and the classroom so that learning objectives are met not just to a satisfactory level but, in some cases, to a superior level.

Closing the Loop. All of John Jay’s programs are using the results of their assessment findings to improve the curriculum and pedagogy. The primary action taken is revising courses. Other actions include increasing or changing specific assignments in existing courses, providing support structures such as tutoring or special help sessions, and reevaluating whether the learning goal or expectations for performance on that goal are appropriate.

One pervasive consequence of assessment efforts is that learning objectives and their achievement are now part of the conversation in every academic department at John Jay. All departments report that they have included discussion of assessment data in their meeting agendas, often with a specific focus on how courses and/or curricula might be revised to improve learning outcomes.




CUNY John Jay College of Criminal Justice administered the CLA in 2012-2013.

CUNY John Jay College of Criminal Justice conducted a value-added administration of the CLA in 2012-2013. The results are displayed below in the SLO Results tab.

For additional information on CUNY John Jay’s process for administering CLA, please click on the Assessment Process Tab below. For information on the students included in the administration, please click the Students Tested Tab.

Why did you choose the CLA for your institutional assessment?

Per the recommendation of the CUNY Task Force on System-Wide Assessment of Undergraduate Learning Gains (Assessment Task Force), the CUNY Chancellor decided that the Collegiate Learning Assessment (CLA) would be the standardized assessment instrument used to measure learning across CUNY colleges. The CLA offers a value-added, constructed-response approach to the assessment of higher-order skills, such as critical thinking and written communication.


Which CUNY John Jay College of Criminal Justice students are assessed? When?

In keeping with CUNY’s decision to use the value-added approach, the goals for the 2012-13 testing were 100 freshmen and 100 seniors. Freshmen were recruited in Fall 2012, and seniors were recruited in Spring 2013.


How are assessment data collected?

CUNY provided John Jay with the list of freshmen and seniors eligible for testing. Initially, a random sample of students was invited to participate. Because this alone did not meet the testing targets, additional recruitment efforts were undertaken in collaboration with many student-service and advising offices on campus. Recruiting seniors required over 1,000 phone calls, visits to capstone courses, and even personalized registration invitations presented to eligible students. A monetary incentive also helped.


How are data reported within CUNY John Jay College of Criminal Justice?

The CLA is a valuable tool for assessing how well John Jay College supports student learning. It has already provided valuable data about our entering 2012 freshman class. For example, although their SAT scores predicted that they would be less prepared than entering freshmen at other CLA-participating institutions, our freshmen scored in the 57th percentile on the CLA, well above the range predicted from their SAT scores. Thus, one thing we can conclude is that the SAT is not a good predictor of our students' abilities.
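The “above the predicted range” reading reflects the value-added logic of comparing an observed cohort score with a score predicted from entering academic ability. A minimal sketch of that comparison follows; the coefficients and scores are invented for illustration and are not the CLA’s published equations or John Jay’s actual data.

```python
# Illustrative only: the regression coefficients and scores below are invented for this
# example; they are not the CLA's cross-institution equations or John Jay's data.

def predicted_cla(mean_sat, intercept=400.0, slope=0.55):
    """Predict a cohort's mean CLA score from its mean SAT score (hypothetical fit)."""
    return intercept + slope * mean_sat

observed_cla = 1020.0  # hypothetical observed mean CLA score for the entering cohort
mean_sat = 1000.0      # hypothetical mean SAT score for the same cohort

residual = observed_cla - predicted_cla(mean_sat)
print(f"Cohort scored {residual:+.0f} points relative to the level predicted from SAT scores")
# A positive residual corresponds to the situation described above: observed performance
# above the level predicted from entering academic ability.
```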


How are assessment data at CUNY John Jay used to guide program improvements?

The rubrics used by the CLA to evaluate student performance appear to be the most promising for assessing student learning. These rubrics assess specific skills such as writing mechanics, writing effectiveness, analytic reasoning, and evaluation as applied to the tasks of making an argument, critiquing an argument, and performing a task (the last of which also assesses problem-solving skills). Currently, the Office of Outcomes Assessment is mapping the CLA scoring rubrics to the skills development courses in the General Education program. Lower-level skills from the rubrics are mapped to lower-level skills development courses, and likewise for higher-level skills and courses. In this manner, we can determine whether appropriate skill levels are met and by which courses. What began as a value-added assessment has evolved into a more practically applied assessment, one that directly helps identify and address areas where skills development falls short.
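As a rough illustration of the mapping described above, the structure might look like the following sketch. The course numbers and level assignments are hypothetical and are not John Jay’s actual General Education curriculum map.

```python
# Hypothetical illustration of mapping CLA rubric skills to skills-development courses.
# Course numbers and level assignments are invented; they are not John Jay's curriculum map.

cla_skills_to_courses = {
    "Writing Mechanics":                 {"level": "lower", "courses": ["ENG 101"]},
    "Writing Effectiveness":             {"level": "lower", "courses": ["ENG 201"]},
    "Analytic Reasoning and Evaluation": {"level": "upper", "courses": ["PHI 231"]},
    "Problem Solving":                   {"level": "upper", "courses": ["MAT 108"]},
}

# With such a map, one can ask which courses are expected to develop a skill on which
# students scored low, and review those courses first.
skill_to_review = "Writing Effectiveness"  # placeholder; would come from CLA subscale results
print(cla_skills_to_courses[skill_to_review]["courses"])  # -> ['ENG 201']
```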


Of 1793 freshmen students eligible to be tested, 100 (6%) were included in the tested sample at CUNY John Jay College of Criminal Justice.


Of 435 senior students eligible to be tested, 100 (23%) were included in the tested sample at CUNY John Jay College of Criminal Justice.


Probability sampling, where a small randomly selected sample of a larger population can be used to estimate the learning gains in the entire population with statistical confidence, provides the foundation for campus-level student learning outcomes assessment at many institutions. It's important, however, to review the demographics of the tested sample of students to ensure that the proportion of students within a given group in the tested sample is close to the proportion of students in that group in the total population. Differences in proportions don't mean the results aren't valid, but they do mean that institutions need to use caution in interpreting the results for the groups that are under-represented in the tested sample.

Undergraduate Student Demographic Breakdown

                                                              Freshmen              Seniors
                                                          Eligible   Tested     Eligible   Tested
Gender
  Female                                                     55%       62%         60%       65%
  Male                                                       45%       38%         40%       35%
  Other or Unknown                                           <1%       <1%         <1%       <1%
Race/Ethnicity
  US Underrepresented Minority                               73%       82%         69%       71%
  White / Caucasian                                          24%       17%         29%       24%
  International                                               3%        1%          2%        5%
  Unknown                                                    <1%       <1%         <1%       <1%
Low-income (eligible to receive a Federal Pell Grant)        63%       64%         72%       75%

Our tested students included a higher proportion of women than our student body as a whole: the general population is 56% female, while the tested sample was 64% female.
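A minimal sketch of this kind of representativeness check, using the freshman figures from the table above (the five-point flag threshold is an arbitrary illustrative cutoff, not a VSA rule):

```python
# Illustrative representativeness check: compare each group's share of the tested sample
# with its share of the eligible population and flag large gaps.
# Figures are the freshman proportions from the demographic table above.

eligible = {"Female": 0.55, "Male": 0.45, "URM": 0.73, "White": 0.24, "Pell-eligible": 0.63}
tested   = {"Female": 0.62, "Male": 0.38, "URM": 0.82, "White": 0.17, "Pell-eligible": 0.64}

FLAG = 0.05  # flag gaps larger than 5 percentage points (an arbitrary illustrative cutoff)

for group, pop_share in eligible.items():
    gap = round(tested[group] - pop_share, 4)
    note = "  <- interpret results for this group with caution" if abs(gap) > FLAG else ""
    print(f"{group:13s} eligible {pop_share:.0%}  tested {tested[group]:.0%}  gap {gap:+.0%}{note}")
```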

The VSA advises institutions to follow assessment publisher guidelines for determining the appropriate number of students to test. In the absence of publisher guidelines, the VSA provides sample size guidelines for institutions based on a 95% confidence interval and a 5% margin of error. Provided the tested sample's demographics represent the student body, this means we can be 95% certain that the "true" population learning outcomes are within +/- 5% of the reported results. For more information on sampling, please refer to the Research Methods Knowledge Base.
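That guideline corresponds to the standard sample-size formula for estimating a proportion. A minimal sketch of the calculation follows; z = 1.96 and p = 0.5 are conventional statistical defaults, and this is the generic textbook formula rather than the VSA's published worksheet.

```python
import math

def sample_size_for_proportion(population, margin=0.05, z=1.96, p=0.5):
    """Sample size needed to estimate a proportion at ~95% confidence and the given
    margin of error, with a finite-population correction applied."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2           # infinite-population size (about 384)
    return math.ceil(n0 / (1 + (n0 - 1) / population))  # finite-population correction

# Eligible populations reported earlier on this page
print(sample_size_for_proportion(1793))  # freshmen -> 317
print(sample_size_for_proportion(435))   # seniors  -> 205
```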

The increase in learning on the performance task is at or near what would be expected at an institution testing students of similar academic abilities.

The increase in learning on the analytic writing task is at or near what would be expected at an institution testing students of similar academic abilities.

Seniors Detail

The charts below show the proportion of tested seniors who scored at each level of the nine subscales that make up the CLA. Subscale scores range from 1 to 6, with 6 the highest. Due to rounding, percentages may not total 100%.


[Charts: distributions of senior subscale scores for the Performance Task, Make-an-Argument, and Critique-an-Argument tasks, across the Analytic Reasoning and Evaluation, Writing Effectiveness, Writing Mechanics, and Problem Solving subscales.]

Freshmen Detail

The charts below show the proportion of tested freshmen who scored at each level of the nine subscales that make up the CLA. Subscale scores range from 1 to 6, with 6 the highest. Due to rounding, percentages may not total 100%.


[Charts: distributions of freshman subscale scores for the Performance Task, Make-an-Argument, and Critique-an-Argument tasks, across the Analytic Reasoning and Evaluation, Writing Effectiveness, Writing Mechanics, and Problem Solving subscales.]