Student Learning Outcomes (SLO)

Fall 2018: New Assessment Cycle

Below are some frequently asked questions about SLO assessment.

To get additional information or to make sure you're working down the right path, please contact Student Learning Outcomes Assessment Coordinator Madeleine Murphy at murphym@smccd.edu or (650) 574-6344.



What is this new assessment requirement?

Our assessment requirement is laid out in our soon-to-be-revised Program Review, which asks departments to discuss two things:
  1. their assessment of student learning in their degree or certificate program;
  2. at least one activity, initiated or participated in by faculty in the department during the two-year period, that focused on improving student learning.
How is this different from what we're doing today?

A little context might help here.

For over ten years, accreditors across the country have asked colleges to gather and assess data on student learning, and to use these assessments to improve student learning.

At CSM, we focused on ensuring that we were complying with these accreditation requirements, emphasizing the importance of completeness (i.e., collecting SLO data on every outcome over a 3-year period).

However, we wanted to connect the SLO process more clearly to activities that improved student learning, and to create more opportunities for faculty to work on interdisciplinary learning outcomes.

So, starting in Fall 2018, we are coming at this requirement from the other end. We won't start by reporting which course outcomes got assessed or what results they yielded. We'll start by answering questions about the health of student learning in our degrees and certificates, and about whatever student learning activity we initiated or participated in.

How are we supposed to assess our programs?

Assessing student learning in our programs means checking to see if our students really are taking away the knowledge, skills and abilities defined in the program outcomes.

This involves two activities:
  1. Reviewing the wording and alignment of outcomes for clarity, currency and coherence. Do the program outcomes clearly define the program's learning goals? Do the outcomes of core courses clearly lay out skills and knowledge that will help students achieve those goals? Is anything missing, or repeated? What about courses outside your discipline - are these still current? Do their outcomes connect to the knowledge or skills defined in the program outcomes?
  2. Taking a snapshot of student learning. This is tricky for degrees and certificates that do not end in a capstone course, but there are ways to take the temperature of student learning: surveys, course-level assignment data, student outreach, success in licensure exams, and other options.
There will be institutional support in the form of flex-day workshops and online tools for assessment. (See "What Institutional Support Can We Expect?")

What if we don't have a program?

A program is defined as a degree or certificate, which not all disciplines offer.

However, the real spirit of assessment is to start at the endpoint - that is, the place that the students are working towards.

So if your discipline doesn't offer a certificate, but does offer sequenced courses, you can look at the overall sequence of courses as a "program".

Or if your discipline mostly supports an institutional outcome, you will focus on participating in institutional assessment activities, and look at student learning in your courses in that context.

What sort of activities come under the heading of "assessment" activities?

Obviously, assessing student learning is about 85% of our job as teachers - we use formative assessments (i.e., class assignments) to see what knowledge, skills or abilities need reinforcing, and we use summative assessments (i.e., the grade) to render a verdict on the student's overall performance.

SLO assessment, however, asks a slightly different question: not "How did the student do?" but rather "How is this course (or degree, or general education pattern) doing?" The goal is to use evidence of student learning to examine strengths and weaknesses in our pedagogy, curriculum, and institutional support, and to find ways to do better - that is, to help students do better.

All kinds of activities might serve this purpose. Here are some examples:
  • A CTE program asks itself, "Does our degree prepare students adequately for the workplace?" Faculty reach out to recent graduates to ask how prepared they felt for the workforce. Ten students reply, most saying that they wish they'd done more collaborative projects. So next semester, the department adds language to the course and program outcomes emphasizing group work, and includes an assignment focusing on group work.
  • The English department asks itself: "Sentence-level writing is a problem: let's prioritize it - but what approach works best?" Teachers choose different methods to trial over the coming semester, and at the end of the term, the faculty compare notes: did one approach work best?
  • Faculty ask themselves: "How well are we achieving our GE goals of helping students to think critically?" A group of faculty gather to discuss critical thinking, isolate a specific critical thinking task they all address in their courses, create a shared rubric, and conduct class-based assessments. Conclusions are shared and discussed at a follow-up forum.
And there are so many more.

Some of these activities can be organized by discipline faculty, while others should be ongoing institutional efforts in which faculty participate.

What institutional support can we expect?
  • Flex day workshops. The college will offer regular sessions on how to assess programs, what sorts of activities to engage in, how to write outcomes, how to align outcomes, and how to connect all of this to collecting learning outcomes data.
  • Institutional learning activities. The College Assessment Committee will organize interdisciplinary activities to bring together faculty to discuss and assess student learning in core competencies (i.e., institutional learning outcomes).
  • Regular reports. The College Assessment Committee will synthesize assessment activities reported in Program Review and report these out to the college community.
  • Resource materials. The College's SLO "toolbox" will be substantially revised to include useful resources for faculty looking to conduct assessments.
  • Personal attention. Email the SLO coordinator!

Will this take a ton of time?

For most faculty, the assessment requirement occupies 2-3 activities, many of which can be completed during a flex day. Bear in mind that it's each discipline that needs to report on assessment activities, not each individual faculty member.

Does this mean SLOs are going away?

No. We're still going to be assessing student learning at the course, program, service and institutional level.

But instead of trying to gather data on every outcome in hopes of formulating questions, we'll start with the questions, then gather data on those outcomes that help us answer them. All of our courses support program outcomes, institutional outcomes, or both - so all of our courses will be involved in regular assessment review.

Why are we changing things?

In Spring 2016, SLO leads in each department were interviewed to find out what was working in our assessment cycle, and what needed improvement, as part of an effort to assess our assessment process.

The emerging recommendation was that while we were in compliance with accreditation requirements, there was room to improve the SLO process so that it directly supported improvements in student learning at the course, program, and institutional level. Faculty expressed a desire for more opportunities to work with colleagues across disciplines and departments to support student learning.


The new approach focuses on addressing issues of student learning, and places SLO data collection in a clear research context. It also emphasizes institutional and interdisciplinary learning. This fits with the College's embrace of Guided Pathways and other models intended to support a more coherent, cohesive educational experience for students. The overarching goal is to establish a culture of assessment, in a community of practice.

So, what should I be doing this semester (i.e., Spring 2018)?

If you are already set to complete an assessment or collect data, go ahead. The data you collect may well help you answer at least one of the assessment questions in Program Review. And to be clear: if what you are currently doing works well for you, by all means, keep doing it.

What about Tracdat?

For the time being, our institutional focus is on Program Review. This is where we will look for discussions of assessment activities and related SLO data. However, faculty are welcome to continue using Tracdat, if this is the most convenient way to store your learning outcomes data. And your existing data will still be available.