Here's one of those times when the top-line stats make you look really good, but once you dig into the data it's more of a "meh" experience. I'm going to preface this short post by saying I don't know how kosher it is to talk about test data in the larger math teacher community. I've taught in failing schools. I've taught in stellar schools. Test data is something that mattered in each of the 6 districts in which I've taught. A lot. It's something we parse and mine and talk about over and over again. So, here's me laying out the good and the bad of last year's data.

Last Spring's 2015 PARCC results for students performing proficient or above:

MATH:

- Algebra 1 - 93% (our score) / State: 30% / District: 49%
- Geometry - 75% (our score) / State: 24% / District: 43%

Background: Almost all of our 8th grade students are in an Algebra 1 equivalent or higher. This means our 8th grade students took subject-specific tests (Alg 1, Geom, etc.) rather than the general 8th grade math test.

At first glance, this data makes our school look flipping awesome. But when you start wading into the massive Excel files, you realize the data isn't as brilliant as it seems. We're a middle school. Our Geometry students and our Algebra students are advanced compared to the average high school student. Of course our scores are better than the local high schools' on subject-specific tests; our kids still do homework. When you compare us to other top-performing middle schools, the story changes.