Voices in Education

Academic Return on Investment: Spending Only on What Works
Duh! Who wants to spend money on what doesn’t help kids learn? No one. But how many superintendents and school boards know what actually works in their district? Not too many.

Two truisms help frame this discussion: (1) no one knowingly funds strategies, efforts, or programs that they don’t think will raise student achievement; and (2) despite the massive infusion of spending into K–12 education over the last twenty years, nationwide achievement has been mostly flat, making it apparent that many of these new efforts haven’t been all that effective. Since 1992, eighth-grade NAEP reading scores have barely budged despite heroic efforts and spending to increase student literacy, and they have been virtually flat since 1998.

In good times, the typical annual district budget discussion goes something like this: “We need funds for math coaches to help implement the new math program, and grant dollars to expand professional development in reading, and resources to start a dropout prevention program, and more staff to add paraprofessionals to help struggling students.” In each case, the asker is certain these efforts will raise student achievement.

In bad financial times, the conversation sounds more like this: “Don’t cut the math coaches,” says the math director, “they are crucial to our efforts to raise scores in math.” “Don’t cut professional development; it’s the lifeblood of improvement,” says the director of curriculum and instruction, “and, clearly, taking money from students at risk of dropping out would be cruel.” And on it goes.

This isn’t just administrators protecting their staff—it also reflects a deep-seated belief that these efforts are important and effective. In the end, whoever can tell the most persuasive story will carry the day. Imagine, however, if before the budget conversation began, the director of data and assessment made the following short presentation:
  • Students of teachers receiving support from math coaches gained three and a half months more learning than students of teachers who did not get coaching. Additionally, in 87 percent of unannounced observations, teachers who received math coaching were observed implementing the program with fidelity.
  • Students of teachers receiving reading professional development (PD) fared no better than students of staff who didn’t receive the professional development. Moreover, only 15 percent of unannounced observations revealed the PD content being put into practice.
  • Of the students entering the dropout prevention program, 75 percent of those with grade-level reading ability (but with significant social, emotional, and/or drug issues) graduated, while only 5 percent of students with significant learning and reading deficits eventually graduated.
After such a presentation, it is very likely that math coaches will stay (or expand), reading PD will be stopped (or changed), and the dropout program will remain in place for one type of student, but not all. The problem is, this kind of hard data is seldom available, so debates and passionate pleas become the norm. Imagine if the presentation went on to share this statistic:
  • Teachers who received twelve hours of coaching had similar results to those who had thirty-six hours of coaching support.
The district might cut back on coaching without fear of losing the benefits.
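The comparisons above amount to a simple calculation: extra learning gained per dollar spent. A minimal sketch of that arithmetic, in Python—all program names, costs, and gains here are hypothetical illustrations, not figures from the book:

```python
# Hypothetical academic-ROI comparison: months of extra learning gained
# per student, per $1,000 spent. All numbers below are illustrative.

programs = {
    "math coaching":      {"cost": 120_000, "students": 400, "months_gained": 3.5},
    "reading PD":         {"cost": 90_000,  "students": 500, "months_gained": 0.0},
    "dropout prevention": {"cost": 200_000, "students": 80,  "months_gained": 2.0},
}

def academic_roi(p):
    """Months of extra learning per student, per $1,000 spent on that student."""
    cost_per_student = p["cost"] / p["students"]
    return p["months_gained"] / (cost_per_student / 1000)

# Rank programs by academic ROI, highest first.
for name, p in sorted(programs.items(), key=lambda kv: academic_roi(kv[1]), reverse=True):
    print(f"{name}: {academic_roi(p):.2f} months per $1,000 per student")
```

A ranking like this makes the budget debate concrete: a program with zero measured gain sorts to the bottom regardless of how passionately it is defended.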

This post is excerpted from
Smarter Budgets, Smarter Schools: How to Survive and Thrive in Tight Times by Nathan Levenson (Harvard Education Press, 2012).

About the Author: Nathan Levenson is managing director of the District Management Council and a former school superintendent. He is the author of Smarter Budgets, Smarter Schools: How to Survive and Thrive in Tight Times (Harvard Education Press, 2012).