What is a good school?
There’s no simple answer. Families have different priorities. Children have different needs. A school that’s perfect for one student may be perfectly awful for another.
This is a real challenge when it comes to rating schools, which typically aren’t good or bad in any absolute sense. If we can’t capture everything a family might value, should we not rate schools at all? We could go that way. But then how would families make informed choices about where to send their kids?
Louisiana has been grappling with this issue for many years. It uses a basic A-F system to rate its schools. The grades for elementary and middle schools are determined mostly by student pass rates on the annual state exams. The grades also reflect how successfully schools help students with low test scores improve to passing, but to a much lesser extent.
For many families, Louisiana’s school grades convey useful information. If you want to know whether most students at a given school are passing state tests, the grades provide a composite picture across grade levels and subjects that would be difficult for a parent to assemble alone. When a school that had been performing at a B level dips to a C, families note the change and may ask why.
But just because there’s one reasonable approach to grading schools doesn’t mean it’s the only way to grade them, or that the current approach tells us everything families might want to know. How we perceive a school’s performance depends on the lens through which we see it.
Imagine two B-rated schools, for example. They share the same letter grade, but one earned its grade by growing lower-performing students up to B level while the other inherited students already at that level and did not grow them at all. To a parent, which one is the better school? If you are a parent with a child who is coming in below grade level, it’s the one that appears to be more successful at growing its students.
So what if we calculated school grades differently?
When we work with families at EdNavigator, they are more interested in how much a school helps students grow than in how many students demonstrate proficiency overall. What if we gave greater consideration to a school’s success in growing its students?
We did just that. Using school-by-school growth data from the Louisiana Department of Education for the 2014-15 school year, we re-calculated grades for New Orleans public schools serving students in grades 3-8.
Our approach was pretty simple. We took the original School Performance Score (SPS) from the state, which is used to determine a school’s official letter grade, and added its growth score for math and reading.
What’s a “growth score”? In a nutshell, each student gets a projected score for the state math and reading test. The projection estimates how the student is expected to perform based on a host of information, including how the student has fared in the past and how much similar students typically grow over the course of a year. By looking at how often students in a given school meet or exceed their projected growth, the state arrives at a composite picture of growth. In one school, 60 percent of students might meet/exceed their target; in another, it might be just 40 percent.
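The growth-score idea can be sketched in a few lines of code. This is only an illustration of the concept described above, with made-up students and scores, not the state’s actual model:

```python
# Hypothetical sketch: each student has a projected test score, and a
# school's growth score is the share of students who meet or exceed it.

def growth_score(students):
    """Percent of students whose actual score meets or exceeds their projection."""
    met = sum(1 for s in students if s["actual"] >= s["projected"])
    return 100 * met / len(students)

# Made-up students for one school
students = [
    {"projected": 710, "actual": 725},  # exceeded projection
    {"projected": 730, "actual": 730},  # met projection exactly
    {"projected": 700, "actual": 690},  # fell short
    {"projected": 745, "actual": 760},  # exceeded projection
]

print(growth_score(students))  # 75.0 -> three of four students met/exceeded
```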
Imagine a school whose official SPS is 80.0. That school would have received a letter grade of C from the state. But that school had 65 percent of students meet/exceed their growth target for math and 70 percent meet/exceed in English Language Arts. We added those three numbers together:
80.0 + 65 + 70 = 215.0
The school’s “adjusted” SPS is 215.0. We did the same calculation for each school, then re-distributed the letter grades based on the new performance scores and a revised scale. We kept the same number of A’s, B’s, C’s, D’s, and F’s. Six schools earned A grades from the state, for example, so we also assigned six A grades in our adjusted model.
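The whole adjustment can be expressed as a short script. The schools and numbers below are invented for illustration; only the arithmetic (adjusted SPS = official SPS + math growth + ELA growth, then re-assign letter grades so each grade keeps its original count) follows the approach described above:

```python
from collections import Counter

def adjusted_sps(sps, math_growth, ela_growth):
    """Official SPS plus the percent of students meeting growth in each subject."""
    return sps + math_growth + ela_growth

# Made-up schools: (name, official SPS, official grade, math growth %, ELA growth %)
schools = [
    ("School 1", 80.0, "C", 65, 70),   # the worked example: 80 + 65 + 70 = 215
    ("School 2", 95.0, "B", 40, 45),
    ("School 3", 102.0, "A", 55, 50),
    ("School 4", 78.0, "C", 72, 68),
    ("School 5", 99.0, "B", 60, 62),
]

# Count how many of each grade the state assigned; we keep the same counts.
grade_counts = Counter(grade for _, _, grade, _, _ in schools)
grade_pool = [g for g in "ABCDF" for _ in range(grade_counts[g])]

# Rank schools by adjusted score, best first, then hand out grades
# from best to worst, preserving the original grade distribution.
ranked = sorted(schools, key=lambda s: adjusted_sps(s[1], s[3], s[4]), reverse=True)
new_grades = {name: grade for (name, *_), grade in zip(ranked, grade_pool)}

print(new_grades)
# With these invented numbers, School 4 (high growth) rises from C to B,
# while School 3 (weaker growth) falls from A to C.
```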
Changing the formula to express greater value for academic growth changes the overall picture. Some schools move up because of the strength of their growth scores while others move down. In some cases, schools that were sure bets according to the original grades now look less certain; others that we might have avoided now seem worth exploring.
To be clear, this is a back-of-the-envelope approach that is not (and was never intended to be) statistically sophisticated. No doubt there are more scientifically sound ways to weight student growth more heavily in the grades, and we hope someone with greater technical expertise does just that. Our priority was simply to incorporate information that parents care about into an existing system. It’s a different way of thinking about school quality that helps our Navigators and our families make the extremely important decision of where to enroll their children for school.
We could choose another approach to assigning grades. In the future, we may do that. There are pros and cons to any formula. Imagine that we had other data to consider, like composite scores for each school based on family engagement and satisfaction surveys. That could change the ratings even more, in fascinating ways. For now, we thought we’d share our thinking. On Thursday, we’ll discuss some of the interesting trends we noticed when we re-graded the schools.