From Data to Design-Driven Decision Making: A New Paradigm for Schools

Accountability systems like those mandated by the federal No Child Left Behind (NCLB) Act have generated enormous amounts of data.  Educational stakeholders, from practitioners to policy makers, have been working to leverage these data to improve school, teacher, and student performance, a practice known generically as data-driven decision making.  (Marsh, Pane, and Hamilton's "Making Sense of Data-Driven Decision Making in Education" is a good resource for more information on this topic.)

Data-driven decision making takes as its point of departure the tip of the iceberg (the “event” or the “fire”) in the Iceberg Model and moves down with the goal of exposing its root cause.  Indeed, the very point of data-driven decision making is to use data to inform our strategies or next steps such that our outcomes improve.

But data-informed decisions used simply to correct problems (or put out the fires) are inherently narrow. As mentioned in a previous post, Ackoff calls this "reactive problem solving." We "walk into the future facing the past" when we are overly reliant on this strategy.  While we certainly do not want to abandon data-driven decision making, it may be worth considering a second, equally important approach, what we're calling design-driven decision making.

The design-driven decision making approach tackles the iceberg at a different level: the structures and mental models that underpin the system.  By focusing on the structure and design of the system, we can begin to anticipate outcomes. 

The tool that we use is Systems Thinking. Barry Richmond's (1994) definition is useful: "Systems thinking is the art and science of making reliable inferences about behavior by developing an increasingly deep understanding of underlying structure" (p. 6). In using Systems Thinking in education, we are developing structural theories about how schools work so that we can intervene to create high-performing schools.

Because real-world behavior involves time delays and feedback loops, we use Systems Thinking to develop theories consistent with those realities.  Rigorous application of Systems Thinking uses stocks and flows to represent accumulations (of students, skills, data, IT capacity).  We include feedback loops to drive dynamic behavior (e.g., increasing performance leads to increasing motivation and thus increased skill development…and thus increased performance).  The combination of stocks, flows, and feedback loops can represent the underlying structure responsible for complex, real-world behavior.
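The reinforcing loop described above (performance → motivation → skill development → performance) can be sketched as a minimal stock-and-flow simulation. The parameter values here are purely illustrative, not calibrated to any real school data:

```python
# A minimal stock-and-flow sketch of a reinforcing feedback loop:
# performance -> motivation -> skill development -> performance.
# All coefficients below are hypothetical, for illustration only.

def simulate(steps=20, dt=1.0):
    skill = 10.0  # the stock: accumulated skill
    history = []
    for _ in range(steps):
        performance = skill * 0.5             # performance tracks the skill stock
        motivation = performance * 0.2        # higher performance raises motivation
        skill_development = motivation * 0.3  # the flow into the skill stock
        skill += skill_development * dt       # the stock accumulates the flow
        history.append(skill)
    return history

trajectory = simulate()
```

Because each step's gain is proportional to the current stock, the reinforcing loop produces compounding growth, exactly the kind of dynamic behavior that a static snapshot of data cannot reveal.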

Many systems have similar structures that produce common and predictable patterns of behavior, what Senge (2006) calls "archetypes": recurring storylines with universal themes and negative consequences. One key archetype in education is the "erosion of goals." In this scenario, gaps between current performance and desired achievement generate pressure on stakeholders to improve via corrective action. But corrective action is often difficult, time consuming, and/or expensive. In such cases, the response may instead be to lower expectations and goals, since this can often be done quickly and cheaply. Eroding goals relieves pressure and reduces the need for corrective action, but often results in a vicious cycle of worsening performance (a downward spiral of lowering goals, then performance, then goals…).
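The structure of the erosion-of-goals archetype can be sketched in a few lines of code. This is a toy simulation with hypothetical parameters; the point is the structure, in which a performance gap can be closed either by slow corrective action or by quickly lowering the goal:

```python
# Illustrative simulation of the "erosion of goals" archetype.
# Parameters are hypothetical; only the structure matters:
# the gap between goal and performance can be closed two ways.

def erosion_of_goals(steps=30, goal=100.0, performance=70.0,
                     goal_pressure=0.3, correction_rate=0.05):
    goals, perfs = [], []
    for _ in range(steps):
        gap = goal - performance
        # Fast, cheap response: lower the goal toward current performance.
        goal -= goal_pressure * gap
        # Slow, costly response: improve performance toward the goal.
        performance += correction_rate * gap
        goals.append(goal)
        perfs.append(performance)
    return goals, perfs

goals, perfs = erosion_of_goals()
```

Because lowering the goal acts faster than improving performance (goal_pressure exceeds correction_rate), the goal drifts down to meet performance rather than pulling performance up: both settle well below the original target.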

The data generated by archetypes are often predictable. To illustrate this point, we invite you to explore the interactive Systems Thinking model below that maps out the erosion of goals archetype as well as the data that alert us to its presence (which we draw from a report in DataCation).

The model looks at how students' credit accumulation is often not aligned with pass rates on New York State Regents exams (the yellow bar below).  That is, students are passing a Regents course but failing the exit exam that assesses content and performance attainment.  In this model, we take as a point of departure that the Regents exams are "robust" and do a good job of aligning content and performance standards (and, to be sure, assessments do not always do this).  The point, however, is not to get caught up in the testing debate, but to explore how the design of a system drives its behavior, using a prevalent example from the field of education.  To explore the model, click on the image below.

In our next few blog posts, we’ll consider the implications of the erosion of goals map for educators and some strategies educators can use to address this phenomenon.

Susan Fairchild is director of program analysis and applied research at New Visions. Follow her on Twitter at @SKFchild.

Chris Soderquist is the founder of Pontifex Consulting, helping clients apply systems thinking to their adaptive challenges.  Follow Pontifex Consulting on Facebook.


