Learning about Student Learning from Community Colleges
January 11, 2009
A Carnegie Perspectives repost
By Pat Hutchings and Lee Shulman
In these days of number crunching and accountability, it’s hard to find a campus that doesn’t have some kind of office of institutional research. These offices vary a lot: large research universities may support a staff of a dozen or more, while small colleges sometimes rely on a single person, or half a person, to get the job done. But what exactly is the job? Traditionally, institutional research has been treated as a kind of company audit, sitting outside the organization’s inner workings but keeping track of important trends and facts (enrollment patterns, student credit hours, graduation rates, peer institutions, and so forth) requested by both internal and external constituencies.
But imagine a different way of thinking about institutional research: as a capacity to work closely with faculty to explore questions about what students are actually learning. Such a shift would mean asking much tougher, more central questions: What do our students know, and what can they do? What do they understand deeply? What kinds of human beings are they becoming, intellectually, morally, and in terms of civic responsibility? How does our teaching shape their experience as learners, and how might it do so more effectively?
As part of a Carnegie Foundation project focused on pre-collegiate, developmental education in community colleges (a partnership with The William and Flora Hewlett Foundation through which we are working with 11 institutions in California), we recently brought together a group of institutional research directors and faculty to talk about the kinds and sources of data needed to improve teaching and learning for the many students who arrive unprepared for college-level courses and who often fail on the long road through one remedial course after another. On the one hand, institutional research is an underfunded, undervalued function on many two-year campuses, and we heard from those who work in IR offices about the frustration of spending scarce time and resources generating information that faculty never see. On the other, we heard from faculty who wish that the kinds of evidence most important for making changes at the classroom level could be made more readily available, and be more valued, at “the top.” But we also heard about some encouraging efforts to bridge these gaps.
At Los Medanos College, for example, getting better information to guide improvement has been part of a shift of focus from “the underprepared student” to “the prepared institution.” The college’s Developmental Education Committee works with staff from the Office of Institutional Research to develop a research agenda that yields data faculty members can use to monitor improvements in student learning. Recently, the Committee asked the IR office to study the relative success rates in elementary algebra of students who had different levels of preparation—requiring data much more specific than what is usually provided by the IR office for program review. “We gathered this data over a two-year period and discovered significant differences in success rates based on type of preparation,” Myra Snell, a professor of mathematics, told the group. “This information was instrumental in several changes: We established a prerequisite for elementary algebra, changed scheduling patterns in the math department, and are now experimenting with different modes of instruction for basic skills curriculum.”
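To make Snell’s example concrete, here is a minimal sketch, in Python, of the kind of tabulation such a request involves: success rates in elementary algebra grouped by type of preparation. The records, field names, and preparation categories below are our own invention for illustration; the actual study drew on two years of institutional data.

    from collections import defaultdict

    # Hypothetical records of the sort an IR office might assemble: each row
    # pairs a student's preparation path with whether the student passed
    # elementary algebra. Categories and values are illustrative only.
    records = [
        {"prep": "prealgebra course", "passed": True},
        {"prep": "prealgebra course", "passed": False},
        {"prep": "prealgebra course", "passed": True},
        {"prep": "placement test only", "passed": False},
        {"prep": "placement test only", "passed": True},
        {"prep": "placement test only", "passed": False},
    ]

    # Tally attempts and passes for each type of preparation.
    totals = defaultdict(lambda: {"attempts": 0, "passes": 0})
    for row in records:
        group = totals[row["prep"]]
        group["attempts"] += 1
        group["passes"] += int(row["passed"])

    # Report a success rate per preparation type: the kind of comparison
    # that can motivate a prerequisite or a scheduling change.
    for prep, counts in sorted(totals.items()):
        rate = counts["passes"] / counts["attempts"]
        print(f"{prep}: {counts['passes']}/{counts['attempts']} passed ({rate:.0%})")

The point of the sketch is how ordinary the computation is; what made the Los Medanos study valuable was not technical sophistication but the fact that faculty and the IR office agreed on the question before the data were gathered.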
City College of San Francisco, a much different and much larger institution, has developed a Web-based Decision Support System. The DSS contains data from 1998 through the present on student enrollment, student demand for classes, departmental productivity, student success as measured by grades, course completion, degrees and certificates, and student characteristics, all of which are available in response to queries from faculty and staff. Thus, an instructor of pre-collegiate English might use the system to find out whether different student groups, by race or age, are particularly at risk in a key sequence of courses he or she is teaching. The department might use the system to see how changes in teaching and curriculum are reflected, or not, in patterns of student success over time. Importantly, we heard from CCSF institutional research staff about the need to work directly with faculty (one-on-one, in small groups, and by departments) to help them envision ways to use the information; the promise, that is, lies not only in supplying good information but in cultivating a demand for it. A study of the DSS found that the increased availability of data has produced a shift in how individuals imagine their role in using information for decision making.
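For readers unfamiliar with such systems, here is a small, self-contained sketch using Python’s built-in sqlite3 module as a stand-in for a query against a system like the DSS. The table layout, course names, and age groupings are all hypothetical; the article does not describe CCSF’s actual schema. The sketch shows only the shape of the question an instructor might pose: completion rates by student group at each step of a course sequence.

    import sqlite3

    # An in-memory toy database standing in for a decision-support system.
    # Table layout and all values are hypothetical.
    conn = sqlite3.connect(":memory:")
    conn.execute(
        """CREATE TABLE enrollments (
               course TEXT,       -- course in a pre-collegiate English sequence
               age_group TEXT,    -- e.g., 'under 25' or '25 and over'
               completed INTEGER  -- 1 if the student completed the course
           )"""
    )
    conn.executemany(
        "INSERT INTO enrollments VALUES (?, ?, ?)",
        [
            ("English 90", "under 25", 1),
            ("English 90", "under 25", 0),
            ("English 90", "25 and over", 0),
            ("English 91", "under 25", 1),
            ("English 91", "25 and over", 0),
            ("English 91", "25 and over", 1),
        ],
    )

    # Completion rate by course and age group: the kind of breakdown that
    # can reveal whether a particular group is at risk at a particular
    # point in the sequence.
    query = """
        SELECT course, age_group,
               COUNT(*) AS enrolled,
               AVG(completed) AS completion_rate
        FROM enrollments
        GROUP BY course, age_group
        ORDER BY course, age_group
    """
    for course, age_group, enrolled, rate in conn.execute(query):
        print(f"{course} / {age_group}: {enrolled} enrolled, {rate:.0%} completed")

As the CCSF staff emphasized, the hard part is not running such a query but helping an instructor see that it is worth asking, which is why they work with faculty one-on-one and by department.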
The Carnegie project meeting generated enthusiasm for further bridge-building, as well. As more and more faculty embrace the scholarship of teaching and learning and begin gathering evidence about their students’ learning, it’s exciting to think about how rich, qualitative classroom-level information can be captured and integrated into larger data systems that others on the campus can access and build on. What may be needed is not an information superhighway but a friendlier set of neighborhood paths and backstreets that take people where they need to go as educators. This, in turn, may require a different way of organizing the work of institutional research—and resources to support its more central role.
To readers who do not work on a campus, all of this may sound like inside baseball. It’s not. Questions about who talks to whom, and about what kinds of information are institutionally valued and available, are central to an institution’s capacity to improve. And while the availability of data is never a sufficient condition for improvement, it is certainly a necessary one. Community colleges—with their “can do” attitudes, and their willingness to experiment—may well have things to teach the rest of higher education about the best ways to think about the evidence needed for improvement.