Opinion | Take a breath: Michigan schools’ test scores aren’t as bad as you think

Ron Koehler is assistant superintendent at Kent Intermediate School District. Sunil Joy is a researcher at Kent Intermediate School District.

Thirty-two percent of Michigan’s fourth-graders score at the seventh-grade level on the National Assessment of Educational Progress test.

Did you miss that headline recently? We did, too.

That’s because it wasn’t reported. Instead, Michigan’s schools and students were slammed by pundits because just 32 percent were “proficient” on the NAEP. To be deemed proficient at the fourth-grade level, students must answer questions written at a seventh-grade level.

If you think that is confusing, or just doesn’t sound right, join the tens of thousands of educators who think the same thing.  

It was reported – and is commonly accepted – that Michigan’s schools are failing and falling behind based on the NAEP scores released April 10.  What is not reported is what the NAEP scores measure and, more importantly, what the student performance measures represent.

Scores on the NAEP are measured on a three-part scale, according to the National Center for Education Statistics (NCES), the federal entity that administers the NAEP. Student performance is reported as “basic,” “proficient,” and “advanced.” What do those words mean to you? With the release of the 2017 results, the associate commissioner of NCES, Dr. Peggy Carr, acknowledged that looking at whether students are at least “basic,” rather than “proficient,” is a better indicator of whether students are meeting grade-level standards.

Unfortunately, the NAEP scores are never reported in that fashion. Proficiency, and the percentage of students who perform “above” grade level, is the coin of the realm. Researcher James Harvey, in his Educational Leadership article “The Problem With Proficient,” reports the bar is set so high that few nations in the world – like our highest-performing states – would report more than 50 percent of students proficient if their students took the NAEP.

In Michigan, if performing at grade level were the measure, the percentage of students meeting the benchmark would be much higher. And again, those rates don’t vary much from the nation’s.

We are victims of what researchers have dubbed “misNAEPery.”

Here are some additional considerations regarding the recently released NAEP scores:

Small differences among states make rankings problematic

NAEP administers its assessment to a small sample of students in each state—much like public polling. And like public polling, there is always a margin of error associated with the results. Because scores across states are often close together, those margins can make a big difference in the rankings.

For example, in fourth-grade reading, 19 states have results that are not statistically different from Michigan’s average. In fact, across three of the four subjects and grade levels, there is no statistically significant difference between the national public average and Michigan’s average. Moreover, there is no statistically significant difference between Florida’s and Michigan’s scores in eighth-grade reading and math—despite Florida having been heralded as a national “bright spot” by U.S. Secretary of Education Betsy DeVos.
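To see why small gaps can vanish inside the margin of error, consider how two state averages compare once their standard errors are taken into account. Below is a minimal sketch of that comparison, using made-up scale scores and standard errors; NCES publishes the real values, and its actual significance procedures are more involved than this simple two-sample test.

```python
import math

def significantly_different(mean_a, se_a, mean_b, se_b, z_crit=1.96):
    """Simple two-sample z-test on NAEP-style scale scores.

    mean_a, mean_b: average scale scores for two states
    se_a, se_b: standard errors of those averages
    Returns True if the gap is statistically significant at roughly 95%.
    """
    gap = mean_a - mean_b
    se_gap = math.sqrt(se_a ** 2 + se_b ** 2)  # standard errors combine in quadrature
    return abs(gap) > z_crit * se_gap

# Hypothetical numbers for illustration only -- not actual NAEP results.
print(significantly_different(218.0, 1.1, 220.5, 1.2))  # False: a 2.5-point gap is within the noise
print(significantly_different(218.0, 1.1, 226.0, 1.2))  # True: an 8-point gap stands out
```

With typical standard errors of about a point, a 2.5-point gap between two states falls within the noise; only a substantially larger gap registers as a real difference, which is why ranking states on small score differences is so misleading.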

Improvement in scores over two or more years is not the same as student growth

Because NAEP is administered every two years, many misinterpret changes in scores as student growth. Student growth refers to the improvement or decline of a single student, or a cohort of students, across two or more points in time. NAEP, however, does not administer the assessment to the same students over time. In fact, the students sampled don’t even take the full assessment; instead, each completes a sub-portion of it, using a technique known as “matrix” sampling. Much to the frustration of local school leaders, this design also prevents individual schools from receiving their own results.
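The idea behind matrix sampling is that each sampled student answers only a fraction of the item pool, and full coverage emerges only across the whole sample. Here is a minimal sketch with a hypothetical item pool; real NAEP booklet designs are considerably more structured than this random assignment.

```python
import random

# Hypothetical item pool split into ten blocks; real NAEP booklet designs differ.
BLOCKS = {f"block_{i}": [f"item_{i}_{j}" for j in range(5)] for i in range(10)}
BLOCKS_PER_STUDENT = 2  # each sampled student sees only a fraction of the pool

def assign_booklets(student_ids, seed=0):
    """Assign each sampled student a random pair of item blocks."""
    rng = random.Random(seed)
    return {s: rng.sample(sorted(BLOCKS), BLOCKS_PER_STUDENT) for s in student_ids}

booklets = assign_booklets([f"student_{n}" for n in range(30)])
covered = {block for pair in booklets.values() for block in pair}
print(f"Each student answers {BLOCKS_PER_STUDENT} of {len(BLOCKS)} blocks;")
print(f"collectively, the sample covers {len(covered)} of them.")
```

Because no individual student, and no individual school, answers enough items for a reliable score on its own, the results are only meaningful when aggregated across the whole sample.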

Correlation is not the same as causation

Differences in NAEP scores can be due to countless factors, ranging from variations in poverty levels to changes in policy. Yet the temptation to jump to conclusions—particularly over short-term changes—is often too hard to suppress. This exact scenario may be playing out in Tennessee. In 2013, the state was lauded as the “fastest improving” in the nation as a result of its sweeping education reforms in the preceding years. Unfortunately, the gains didn’t carry into subsequent years, with 2017 results described as “disappointing” by local stakeholders.

Given these complexities, should we toss out NAEP altogether? No. NAEP provides states with a general sense of how their students are performing. It can also open the door for further investigation of trends and the areas worth monitoring.

But scoring states, and scaring policy-makers and parents with scores and stories that make it appear as though their schools are failing, does a disservice to our students, educators, schools and districts. There is much improvement to be made, and Michigan’s schools must get better if we are to restore our mitten to the economic powerhouse it once was.  The improvements to be made are much more manageable if we are honest about our scores and our shortcomings.

Bridge welcomes guest columns from a diverse range of people on issues relating to Michigan and its future. The views and assertions of these writers do not necessarily reflect those of Bridge or The Center for Michigan.


Comments

Kevin Grand
Fri, 06/01/2018 - 12:56pm

Have we read the same report?

According to the numbers I read earlier this week, the overall state numbers were essentially unchanged.

https://www.nationsreportcard.gov/reading_2017/#states/scores?grade=4

https://www.nationsreportcard.gov/math_2017/#states/scores?grade=4

And in districts like Detroit, the overall test numbers had actually fallen (even after a Republican-led Michigan taxpayer bailout).

https://www.nationsreportcard.gov/reading_2017/#districts/scores?grade=4

https://www.nationsreportcard.gov/math_2017/#districts/scores?grade=4

duane
Fri, 06/01/2018 - 8:18pm

When you only look for failure or disappointment, that is what you will find.
If you want success, you must look for it. There are thousands of learning success stories each year in Michigan, and we waste each one of them by ignoring what we can learn from those successes and how we can use those lessons to help others succeed.

The most important thing we should do with this article is interview those 4th-graders, asking them how and why they succeed at reading. Ask them if they read and to describe how they read [how often, how much, where, what, why]. Ask them what the barriers to their reading are, how they work around those barriers, and why they make the effort to work around them.

The authors mention the difference between correlation and causation. We need to better understand the causation, and the only way we can help the underperforming is by understanding the causes of success; without understanding what makes for success, we have no reference for knowing what is causing disappointment and no way to be sure what needs to change.

Do you believe that the student has a role/responsibilities in the learning? Do the reports we keep seeing mentioned describe those roles/responsibilities? Do they describe how the successful students are fulfilling them and how the unsuccessful are not? Do the reports describe the learning process and how the students are conforming to those processes? The fact is that all the numbers only offer correlation to results and fail to provide descriptions of causation.

Chuck Fellows
Sat, 06/02/2018 - 11:35am

The only realistic conclusion to be drawn from the NAEP report is that pols and pundits are gripped in a frenzied state of metrication. Gotta have those numbers so we can punish our way to success!
The NAEP must not be used to compare nations, states, districts, schools, teachers or students, according to the NAEP itself. So right off the top, all this moaning and groaning is ridiculous.
The long term trend in NAEP results is up. Surprise, surprise. Really rather remarkable since all those moans and groans come from those who keep moving the goal posts and insist on prescribing in great detail what to teach, when to teach and how to teach.
So all the hand wringing about scores is meaningless.
And where is it ordained, in what credible research, that children are supposed to read (or master any other academic discipline, for that matter) by a date or age certain? They learned the complexities of language and the digital age without our interference. Maybe, just maybe, the real barrier to learning growth is our interference. Pogo was correct.
Want to continually improve? Listen to the teachers. Ensure they have authority commensurate with their responsibility (curriculum, pedagogy and assessment). Let the children follow their learning journey, not ours. Flip the funding: operational budgets developed at the classroom level by teachers, supported by all our bureaucratic drag, with the legislature required to fund those budgets. Capital resources such as buildings and equipment should be funded by the state, with the state responsible for ensuring all capital resources are equitable at the level of our 55 hold-harmless school districts. All the revenue is already being collected - and wasted in our one-size-fits-all system, with the mismanagement of pension promises and funds the result of legislative game playing.
Most of all, we need to stop being so stupid trying to impose upon our children our worldview. It is their learning. Let them be responsible for it.

Gary Meehan
Sat, 07/14/2018 - 10:21am

Well put...

Elizabeth L.
Mon, 06/04/2018 - 8:17am

While this editorial is not wrong, it is misleading. The NAEP results provide lots of data we can use to try to understand how students are learning in Michigan. By way of comparison: you can congratulate yourself for paying your gas bill and electric bill, but if you can't pay your mortgage, what good is that? NAEP is one measure of student performance, but we have additional measures telling us the same thing about Michigan students. In many districts in my county, 30-40% (or more!) of 3rd graders would be retained for failing the state reading test, which is not written at a 7th-grade level. Does it make you want to celebrate that a percentage of our students scored as advanced on the state test? Sure! But that's not who needs our energy and attention. The students who do not have basic reading and writing skills fall farther behind each year.
A narrative that "it's not that bad" fuels dangerous arguments that the education funding system is okay, and that some 'eggheads' are just getting worked up over nothing. However, our state testing data (M-STEP and SAT/MME) is not painting a remarkably different picture from NAEP. These tests are imperfect. We need to take a hard look at all the information we have about student learning in the state, including the data that suggests we must change, and decide what our next steps will be.

Corey Peña
Mon, 06/04/2018 - 6:24pm

Yes, our state tests are written far above a natural grade level! To be considered proficient, you must score at around the 56th to 60th percentile. That is above average. We really need to understand how these exceedingly high benchmarks go against any natural learning curve studied since the beginning of modern learning theory. Anyone awake out there? This article is a very positive step in shining a light on how ridiculous the whole standardized testing craze is.