Publications and Resources

Engagement Matters: Volume 13, Issue 2

Director's Note
September 2016

Evelyn Waiwaiole

Welcome back to fall 2016. As the term begins and new and returning students settle in, I hope you are energized by their hopes and dreams as they advance on their academic journeys.

Here at the Center, we are looking forward to a busy year as we prepare two reports for release this spring. The first report, funded by the MetLife Foundation, will focus on the financial health of community college students and will be released in February. Approximately one in three American undergraduates receives a Pell Grant and is therefore considered low income. As more states offer free community college tuition and federal policy conversations advocating free tuition continue to circulate, understanding the financial health of the students we serve could not be timelier.


Then, in April, the Center will release a report on "part-timeness." Just as there is much conversation about free tuition, there is a growing debate about the importance of students enrolling full time. This report will look at part-time students, their engagement, and their success based on course completion and persistence.

It's too early to know yet what stories the data will tell, but we look forward to sharing this work with you in the spring. Until then, stay in good touch.



Updated CCSSE Instrument Coming in 2017

CCSSE 2017

Colleges participating in the 2017 administration of the Community College Survey of Student Engagement (CCSSE) will notice a refreshed survey instrument. Since 2001, CCSSE has provided actionable student engagement data to community colleges across the United States and beyond. Over its first four administrations, the survey underwent several revisions, evolving in 2004 into the instrument used by member colleges through the 2016 administration. The world of higher education has changed since 2004, though, and Center staff recognized a need to update CCSSE so that it would continue to provide actionable data to its member colleges.


Colleges familiar with the survey in its current form will see that, on the surface, very little has changed. The updated survey is still eight pages long, and its design and item-level formatting are consistent with those of the current survey. Most items on the updated survey are unchanged, and the majority of those that were revised underwent only minor adjustments to wording or response categories. Items that no longer provide relevant data (e.g., outdated technology items) have been eliminated, and the new instrument includes several items on High-Impact Practices (orientation, academic advising, etc.) that were previously administered as optional additional items. The refreshed survey also includes items about library and active military/veteran services, as well as new demographic items about veteran/active military and college athlete status.

The sampling and administration procedures will not change, and the institutional reports will contain the same familiar data. Additionally, raw data files will still be available both on the website and upon request.

The refreshed instrument was never intended as an overhaul of an already-successful tool for data collection and benchmarking. Rather, the updates consist of a series of changes that will improve the quality of data available to colleges and make the survey more relevant in relation to changes in the community college field.

View a sample copy of the updated CCSSE.

View an item-by-item comparison of what has changed on the survey. Changes to existing items are indicated. Deleted items and new items are also flagged.

The Center will release a webinar in October detailing all of this information and more. To be notified when the webinar is released, please contact



Center Research Staff Launches SENSE Validation Project

SENSE logo

Nationally, community colleges enroll almost half of all American undergraduates; a quarter of those students do not persist to the second term, and nearly 50% drop out before their second year.

Findings from 20 years of research on undergraduate education, including over a decade of data from the Community College Survey of Student Engagement (CCSSE), consistently show that the more actively engaged students are with faculty, staff, other students, and the subject matter, the more likely they are to be successful. Yet because CCSSE is administered in the spring academic term, these findings have not addressed the engagement experiences of the students who do not persist to the second term.


In response to this gap in understanding the student experience, the Survey of Entering Student Engagement (SENSE) was created in 2007 to collect and disseminate information about institutional practices and student behaviors in the first few weeks of college. Through SENSE, colleges are able to gain insight into the entering student experience and to improve practices in ways that will increase student success in the first college year.

Since the survey was created, it has been refined through cognitive interviews and feedback from member colleges. The sampling, administration, data analysis, and reporting methods have been established as well; the missing piece, though, has been a validation of the survey. In March of this year, the Center received a three-year grant from The Greater Texas Foundation to conduct this work. The study seeks to provide scientifically based evidence that the survey data are valid and reliable with respect to early student experiences and outcomes. The validation study will include factor analyses to examine the six SENSE benchmarks; additionally, because so many students do not return after their first or even second term, the study will focus on early student outcomes, including fall-to-spring and fall-to-fall persistence, completion of developmental courses, and completion of gatekeeper courses.

The Center has partnered with 10 diverse community and technical colleges in Texas to conduct the validation study. Partner colleges are currently administering SENSE. The survey data from students who complete the student identifier field will be matched with student record data including courses attempted and course outcomes for each term from fall 2016 through fall 2017. The results of this study will be released in the fall of 2018.

The Center would like to thank the following colleges for participating in this work: Angelina College, Blinn College, Coastal Bend College, El Paso Community College, Lamar State College—Port Arthur, North Lake College, Southwest Texas Junior College, St. Philip's College, Temple College, and Texas State Technical College—Harlingen.



Center Releases New Guide to Support Accreditation Work

Accreditation Guide

The Center is pleased to release a newly updated Accreditation Guide. Participation in Center surveys (the Community College Survey of Student Engagement [CCSSE] and the Survey of Entering Student Engagement [SENSE]) has long been a resource and a standard for measuring student engagement on community college campuses. What is less well known is that these tools can assist a community college in making its case for meeting accreditation standards. While Center survey data are not direct measures of student outcomes, they are measures of student behaviors: Center surveys measure the extent to which students are engaged in educationally meaningful activities that are empirically linked to student success. If used systematically over time, Center survey data can provide deep insights into the appropriateness of institutional goals and demonstrate the extent to which a college is meeting its educational objectives.


Several years ago, the Center released its initial series of accreditation guides, which included survey items directly mapped to regional standards and sub-standards. With the continual evolution of accrediting criteria, however, Center staff found it difficult to keep the guides up to date. The Center therefore moved to a single comprehensive guide that would remain available and useful for member colleges as criteria change.

The Center created the new accreditation guide by synthesizing information from each of the six American regional higher education accreditors and identifying five overarching concepts applicable to Center survey items: Physical and Technical Resources, Teaching and Learning, Student Support Services, Institutional Effectiveness and Planning—Education Programs, and Institutional Effectiveness and Planning—Student Services.

These concepts were matched to individual items in the CCSSE and SENSE instruments. As standards vary across accrediting bodies, this guide serves as a general tool for using Center survey data to supplement an institution's accreditation work. Each institution is encouraged to use the guide in the way that best supports its work.

The guide also features vignettes, or examples of how institutions from each of the accrediting bodies have used Center survey data to support this work. The guide refers to items from the 2005–2016 CCSSE instrument and items from the current SENSE instrument; the Center plans to release an updated version of the guide in 2017 to reflect the refreshed CCSSE instrument. To view the guide, visit



Klamath Community College Provides Professional Development to Help Engage Students Beyond the Classroom

Klamath Community College

Klamath Community College (KCC) in Oregon has a long-standing commitment to providing meaningful professional development for its faculty members. Results from Center for Community College Student Engagement surveys provide the college with an ability to assess the effectiveness of this investment in quality learning for students.

Through professional development that examines best practices for curriculum delivery, faculty have determined that engaging students requires pedagogy that extends well beyond traditional lecture methods. As a result, many faculty have taken steps to create a project- and challenge-based curriculum delivery model.


For instance, KCC's Community Based Capstone Project class culminates with students using the knowledge they have acquired to live stream KCC's graduation ceremony. This large-scale project is feasible because the cohort of students in this program of study work throughout the year on delivery of multimedia content for community projects. One of those projects is a partnership with the local newspaper to produce augmented reality supplements to news stories.

Additionally, the Office Procedures class partners with culinary classes and student clubs to deliver special events such as the Administrative Office Professionals Recognition Day. Students in the Office Procedures class are required to work with multiple classes and groups to plan a complete event from start to finish. Students are responsible for facilities use forms, budgeting, requisition of resources, planning an agenda, and all the steps between to deliver a meaningful event that can only be completed with teamwork. The college takes advantage of this great training by regularly hiring from this pool of graduates.

These examples represent some of the early adopters of this engaging pedagogy, which is now spreading to other departments. Students at KCC have historically come to campus for class and left as quickly as possible to tend to their busy lives off campus. Engaging students beyond the classroom takes a concerted, planned approach to curriculum delivery. With this new approach to teaching and learning, faculty have discovered that students learn and engage more when the projects are community centered, real, and legitimate.

How does KCC know this approach to project- and challenge-based delivery of curriculum is effective? Over time, the college has monitored Community College Survey of Student Engagement (CCSSE) and Survey of Entering Student Engagement (SENSE) items from the benchmark areas of Engaged Learning, Academic Challenge, and Student-Faculty Interaction, such as "Work with other students on a project or assignment during class," "Work with classmates outside of class on class projects or assignments," "Participate in a required study group outside of class," and "Use an electronic tool to communicate with another student/instructor about coursework." Results from these items within the CCSSE and SENSE benchmarks have shown a positive trend over the past five years.

KCC will continue to monitor the results from Center surveys to assess how Engaged Learning and other benchmark areas are affected by institutional initiatives to improve quality learning.



Cognitive Interviews: The Difference Between a Good Survey Item and a Great Survey Item


The landscape of higher education is constantly changing, and in response, the Center for Community College Student Engagement is continuously developing new survey items to augment the primary survey instruments. Beyond extensive research and deliberation among Center staff, an invaluable step in the creation of new survey items is the collection and analysis of data from cognitive interviews. Additionally, cognitive interviews are important for ensuring that the understanding of previously tested items does not change over time; in preparation for the launch of the updated CCSSE in 2017, the Center conducted cognitive interviews for the current survey items as well as the items that are being added to the updated version.


Cognitive interviews are crucial to survey item creation because they inform researchers about whether students interpret the items as intended. The first step in the cognitive interview process is a formal request to the college president or CEO seeking permission to conduct interviews at the college. The Center then works with an institutional contact on logistical details such as space for the interviews and the recruiting and scheduling of student participants.

During cognitive interviews, Center staff follow a prescribed protocol. Students are first asked to read and sign a consent form. Then, they are given a copy of the survey items and asked to respond to them. Students are instructed to stop at specific intervals so the interviewer can ask non-leading questions about how the students are interpreting the items and whether anything about the items is confusing. Interviews are audio-recorded and last from 45 to 90 minutes, depending on the number of items.

Results of the interviews are analyzed by Center staff, and when themes surface, items are revised. For example, during the cognitive interviews on the CCSSE instrument, a large number of students did not understand what "synthesizing" meant in the item "During the current school year, how much of your coursework at this college emphasized ... Synthesizing and organizing ideas, information, or experiences in new ways." As a result, this item has been reworded on the updated version of CCSSE: "During the current academic year, how much has your coursework at this college emphasized ... Forming a new idea or understanding from various pieces of information."

Between the fall of 2014 and early 2016, the Center conducted cognitive interviews on the topics of Academic Advising and Planning, Engagement Through Technology, Information Literacy, Part-Timeness, and Race and Ethnicity, as well as the updated CCSSE instrument. We would like to thank the following colleges and their students for participation in this work: Chemeketa Community College (OR), Del Mar College (TX), Lone Star College—North Harris (TX), Maricopa Community Colleges (AZ), San Jacinto College (TX), and Temple College (TX).

To protect student privacy, interview data are used only by Center staff and are not published.



Center Holds Annual Meeting With National Advisory Board

National Advisory Board

Since the Center's inception 15 years ago, it has received expert guidance from a small but distinguished group of community college leaders and outstanding researchers from across America. This group provides strategic and policy guidance during an annual two-day meeting and through correspondence throughout the year. The group met with Center staff during the annual meeting May 25–26 in Austin.


The National Advisory Board members with active terms at the time of the spring 2016 meeting included the following:

Dr. Walter G. Bumphus, President and CEO, American Association of Community Colleges

Dr. Peter Ewell, President Emeritus, National Center for Higher Education Management Systems

Dr. Cynthia Ferrell, Executive Director, Texas Success Center, Texas Association of Community Colleges (TX)*

Dr. Rufus Glasper, President and CEO, League for Innovation in the Community College

Dr. Maria Harper-Marinick, Chancellor, Maricopa Community Colleges (AZ)*

Dr. Audrey J. Jaeger, Professor of Higher Education and Alumni Distinguished Graduate Professor, North Carolina State University

Dr. Christine Johnson, Chancellor, Community Colleges of Spokane (WA)*

Dr. Steven L. Johnson, President and CEO, Sinclair Community College (OH)

Dr. Karon Klipple, Executive Director, Community College Pathways, Carnegie Foundation for the Advancement of Teaching*

Dr. William Law, President, St. Petersburg College (FL)*

Dr. Alexander C. McCormick, Associate Professor of Educational Leadership and Policy Studies; Director, National Survey of Student Engagement, Indiana University Bloomington

Dr. Cindy L. Miles, Chancellor, Grossmont-Cuyamaca Community College District

Dr. Dale K. Nesbary, President, Muskegon Community College (MI)

Dr. Lawrence A. Nespoli, President, New Jersey Council of County Colleges

Dr. Daniel J. Phelan, President, Jackson College (MI)

Dr. Vincent Tinto, Distinguished University Professor Emeritus, Syracuse University

Dr. Philip Uri Treisman, Professor of Mathematics, Professor of Public Affairs, and Executive Director, The Charles A. Dana Center, The University of Texas at Austin*

*Not pictured

Learn more about the Center's current advisory board here:



Center for Community College Student Engagement
—a Service and Research Initiative—
Program in Higher Education Leadership | Department of Educational Leadership and Policy  | College of Education
The University of Texas at Austin
Comments to: