We’re at an interesting moment with the public part of public education.
Years into the “assessment” craze, we’re compiling ever more data about our schools – and providing precious little useful information about how they’re really doing.
If you doubt this, go online and check out the state of Washington’s “report cards” on local schools and districts. They are chock-full of figures, stats, numbers, percentages, test scores – oh, the test scores – and yet, somehow, they offer the typical parent virtually nothing clear and direct to help them judge the quality of their kid’s school.
Now boot up the Google for “Colorado Growth Model,” and prepare to be impressed. Don’t take my word for it – listen to what Alan Burke, deputy superintendent of the Washington Office of Superintendent of Public Instruction, has to say about it.
“It’s pretty cool,” he said. “We’ve been interested in the growth model for quite a long time.”
Based on measuring each student’s year-to-year improvement and compiling that into easy-to-read, plotted comparisons between schools and districts statewide, the Colorado model is a dramatic step forward from the one-test, one-yardstick approach.
It doesn’t just tell you which schools are scoring high on the tests – which you might simply guess, unfortunately, by looking at how expensive the homes in the neighborhood are – but it also uses a complicated calculation to produce simple, comparative graphics showing which schools are helping students improve the most.
The Seattle school district recently released a similar warts-and-all picture of how its schools are doing and which ones are showing the most student improvement.
“It’s probably the best thing out there in terms of really understanding what schools are doing,” said Christine Campbell, a researcher at the University of Washington’s Center for Reinventing Public Education. “We need to do things dramatically differently. We’re losing an entire generation of kids.”
For now, though, adopting a Colorado-style model will have to remain on a wish list for a Christmas in some distant, post-recessionary year. Colorado’s software is open-source and free to anyone, but it would cost the state about $450,000 to gather and organize the right data to get it started, and we’re too busy not raising taxes and cutting basic services for the poor to be wrapping up this particular present right now.
Still, we ought to do it as soon as we can.
Idaho might be a little closer to adopting a “growth model,” since it’s part of a grant-funded consortium of states exploring better ways to chart school progress. The Colorado model is but one of the options being considered, said Melissa McGrath, spokeswoman for the state’s Department of Education.
“Idaho wants to move toward a growth model,” she said. “Colorado’s is one we’re looking at. I think every state is looking at that.”
Richard Wenning, associate commissioner of Colorado’s Department of Education, said that a dozen states are in the process of adopting the model.
“It’s most beneficial to do this on a statewide basis,” he said. “We’re really trying to change the whole entire manner in which states understand the performance of their schools.”
Efforts to measure and assess the quality of schools are always fraught with peril. No Child Left Behind remade schools in ways that can’t be considered positive – the oft-cited teaching to the test, the wholesale adoption of exam performance as our single measure of learning – and yet the complete and total resistance to assessment from some in education can feel like a complete and total resistance to improvement.
It feels a little too patient. I liked something Campbell said about patience: We need to lose it when it comes to poor performance.
The great thing about the Colorado model, which has support from the teachers unions there, is that it does not measure performance as simply this year’s score on the statewide test. It measures a variety of goals and standards, and it places each student among his or her peers to measure progress.
The score isn’t the thing. The rate of improvement is.
“It keeps us from taking the one day the kid takes the test – or doesn’t take the test and gets a zero – and saying that measures their entire education,” said Jeanne Beyer, spokeswoman for the Colorado Education Association.
This model also has the potential to expose the schools that are beating the odds in specific, public ways. The recent release of the Seattle schools’ data, for example, shows a lot of expected results: The poorest neighborhoods in town had the worst performances, and the wealthiest did the best, broadly speaking.
But the exceptions were instructive. Campbell noted a couple of schools that had really bucked the trend, showing high rates of improvement despite operating in impoverished neighborhoods.
Socioeconomics “are not the end of the story,” she said. Educators can use the growth model to identify the standouts and ask themselves:
“Here’s an example of a school in the city that is beating the odds. What are they doing?”