New Visions for Public Schools

The Adjacent Possible, Part II: Designing an Early-Warning System for Tracking Student Performance

In a previous post, I described how taken I am with the idea of the adjacent possible, a concept first proposed in evolutionary biology that I think has applications elsewhere. 

At New Visions, our data work is as subject to the principle of the adjacent possible as anything else. When we began working as a public school support organization in 2007, we suddenly found ourselves awash in data. Our schools needed a way to make sense of all this information, particularly so that they could track student progress and help those students who were "off-track."

I've come to realize that data are sort of like molecules: credit accumulation, attendance rates, demographic information, exam scores, passage rates, graduation rates, marking period grades, behavioral infractions. These are the basic building blocks of most early-warning systems designed to help schools monitor their students' progress.  This was our primordial data soup.

To fit these pieces together into something meaningful, we needed a blueprint. We turned to research from the Consortium on Chicago School Research (CCSR) and other key literature from the field (see, for example, Balfanz and Legters; Neild, Balfanz, and Herzog; Pinkus).

The problem: when students fail to hit critical benchmarks (e.g., high attendance, continuous credit accumulation, high grades), their likelihood of dropping out of high school, or of not enrolling and persisting in college, increases. CCSR found that students who hit their freshman benchmarks are three-and-a-half times more likely to graduate in four years than students who don't. Relatedly, we know that students who miss those freshman benchmarks are more likely to drop out of high school later on.

This problem is made much, much worse by constantly changing graduation requirements (e.g., NYC's graduation requirements have changed seven times in the last eleven years); by parents and students who don't know what the graduation requirements are (in NYC, approximately 15 percent of public school students are identified as having limited proficiency in English); and by fragmented data systems and school staff who are unskilled in data analysis.

Once we wrapped our heads around the full scope of the problem, the question we asked ourselves became "How do we make this problem visible?" How do we combine our core student data (credits, grades, exam scores) in such a way that we can quickly identify a student who takes a wrong turn? How do we design data tools so simply and clearly that student performance comes through regardless of language barriers? How do we integrate these data so that the critical information is accessible and transparent to educators and only a click away? Basically, what were the steps we needed to take to put an early-warning system into place?

At New Visions, we did three things. We identified clear benchmarks that students need to hit in order to be ready for graduation and college; we created multiple tools for multiple audiences, such as parents, teachers, and guidance counselors; and we provided data on demand via the DataCation platform, a web-based student information system designed by teachers. We also talked to our principals, teachers, guidance counselors, data specialists, students, and parents. Listening to and learning from the folks in schools, those on the front lines, matters more than anything.

In short, we created what Stuart Kauffman and Steven Johnson refer to as first-order combinations, the first step in the ongoing "evolution" of our early-warning system. That is, the steps we took to categorize and visualize a student's progress toward graduation (on-track, almost on-track, off-track) are inherently tied to the way we framed the question: How can we identify the students at risk of failing to meet critical academic benchmarks? By combining our data molecules around this question, we could see more deeply into patterns of student performance within a school. This early-warning tool allowed us to take stock of a student's progress at a moment in time and gave educators a way to identify students who were at risk of not graduating because of missing credits, Regents exams, and so on.
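To make the idea of a first-order combination concrete, here is a minimal sketch of how an on-track/almost on-track/off-track classification could be assembled from those data molecules. The benchmark thresholds and field names below (credits_earned, attendance_rate, regents_passed) are illustrative assumptions for this post, not the actual rules used by New Visions or the DataCation platform.

```python
# Illustrative sketch only: thresholds and field names are assumptions,
# not New Visions' or DataCation's actual benchmark rules.
from dataclasses import dataclass

@dataclass
class StudentRecord:
    credits_earned: float    # cumulative credits earned
    attendance_rate: float   # share of days attended, 0.0 to 1.0
    regents_passed: int      # number of Regents exams passed

# Hypothetical cumulative benchmarks by grade:
# (credits earned, attendance rate, Regents exams passed)
BENCHMARKS = {
    9:  (11.0, 0.90, 1),
    10: (22.0, 0.90, 2),
    11: (33.0, 0.90, 4),
    12: (44.0, 0.90, 5),
}

def track_status(student: StudentRecord, grade: int) -> str:
    """Classify a student as on-track, almost on-track, or off-track."""
    credits_needed, attendance_needed, regents_needed = BENCHMARKS[grade]
    misses = sum([
        student.credits_earned < credits_needed,
        student.attendance_rate < attendance_needed,
        student.regents_passed < regents_needed,
    ])
    if misses == 0:
        return "on-track"
    if misses == 1:
        return "almost on-track"
    return "off-track"

# Example: a 10th grader with 18 credits, 85% attendance, 2 Regents passed
print(track_status(StudentRecord(18.0, 0.85, 2), grade=10))  # -> "off-track"
```

Even a toy rule like this shows why the combination matters: looking at any one field in isolation tells only part of the story, while the pattern across fields is what flags the student.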

Having these new data tools in hand then allowed us to ask a different question. What are the invisible systems in schools that might be systematically producing risk – and how can we build on our first-order combinations?

Lately, we've taken these first-order combinations and tinkered with them, recombined them, reimagined them. We've been opening doors and pushing the boundaries of our adjacent possible. Re-imagining our first-generation early-warning tools has allowed us to develop second-order combinations, thereby expanding our adjacent possible. We've also infused our work with art, collaborating with the ingenious, wildly talented graphic designers Sarah Slobin and Andrew Garcia Phillips.

These second-order combinations have opened a new set of doors and taken us in a very interesting, provocative direction.

In my next post, I'll introduce you to the new second-order tools we're developing and the very different problem in schools that these tools are helping to make visible.

Susan Fairchild is director of program analysis and applied research at New Visions. Follow her on Twitter at @SKFchild.
 
