How Michigan, Florida compare

Michigan vs Florida

A series of reforms over the past 15 years has produced huge academic gains for students in the Sunshine State, while scores for Michigan students have remained relatively flat. Florida has passed Michigan in 4th-grade reading, and 4th- and 8th-grade math.

[Charts: 4th-grade math, 8th-grade math, 4th-grade reading and 8th-grade reading scores for Michigan and Florida, broken out by all students, poor* students, not-poor students, Black students and White students.]

Source: National Assessment of Educational Progress.

* Note: Students identified as poor are those eligible for a free or reduced-price lunch. Those identified as “not poor” are all other students.

Comments
Chuck Fellows
Tue, 09/30/2014 - 10:12am
Wrong! Totally wrong. Florida changed its system of measurement. No comparison. Besides, scores on standardized tests of any kind, especially when aggregated, are totally and completely meaningless. There is absolutely no actionable information contained in the data provided. It is data without context; no context, no meaning. Learn, adults - you cannot ascribe quantitative measures to qualitative activities (i.e. learning) no matter how hard you try or repeat "The Ghost of Pythagoras". Get some profound knowledge instead of seeking instant-pudding answers that satisfy your limited worldview.
Carl Fry
Tue, 09/30/2014 - 2:51pm
Really? We can't quantify education? Does the dollar bill in your wallet represent a quantitative or qualitative measure? I would suggest both. It is a quantitative measure by which you can purchase very specific amounts of very real items. It is a qualitative measure which the world prices daily relative to other currencies and economies. I suggest similar measures can and should be meaningfully applied to the output of educational institutions. Yes, education is a complex process, but one that has always been measured (report cards?) and always should be measured. There is nothing wrong with having an intelligent discussion over evolving measures of accountability. Let's fine-tune the process rather than simply rejecting the first really comprehensive efforts to quantify the most important product/process in our society.
Steve Smewing
Tue, 09/30/2014 - 6:08pm
Since when has education not been measured by standardized testing? Chuck, you make strong claims; perhaps you could link to your evidence so others can see whether you just hold strange opinions or are working from fact-based evidence. As it stands, I cannot help but see your opinion as pure rubbish.
Leon L. Hulett, PE
Thu, 10/02/2014 - 8:40pm
Chuck Fellows (September 30, 2014 at 10:12 am): I don't think I understand your “The Ghost of Pythagoras” reference. Are you saying someone is trying to divine a truth from something that obviously has no such truth in it? If so, here is a suggestion I think will fill that need - the need being how to take something qualitative and demonstrate it in a context that has meaning for the student. One student demonstrates an idea to another student with actual objects. First the student demonstrates how he understands the idea applies to something in his life. Then the second student asks the first student to demonstrate how the idea applies to a context provided by himself. The teacher could supply the contexts for this student to read. As a tutor, I supply the contexts from my years of experience in industry and life. I ask the two students to start out simple, and we work up to much more complicated examples. Then I have them repeat the demonstrations until they can do one quickly and easily. I believe at that point the student would understand the idea well enough that he should have no trouble answering questions such as those on the NAEP or the ACT. But as you said, what value do such tests actually represent?
John Q. Public
Tue, 09/30/2014 - 11:33pm
I don't know how long it will take--a month, a year, a decade--but I'm patiently waiting for the inevitable investigative report that reveals how the numbers were massaged, manipulated, or outright lied about. How the inputs were "managed" to reach predetermined outcomes. How students were informed of the test questions in advance, or were "taught" directly from the test questions. I don't have any peer-reviewed data, or any other "studies" to suggest this is true, but I'm willing to bet that way anyhow.
Leon L. Hulett, PE
Thu, 10/02/2014 - 8:16pm
Mike, the 2014 ACT scores for Michigan (20.1) and Florida (19.6) do not support your premise. Also note the percentage of students tested: MI (100) and FL (81). Leon
Jerri B
Thu, 10/02/2014 - 8:22pm
For the naysayers, please review the NAEP website. The following information comes from its homepage and sheds light on the data used for this report. Read and learn... "The National Assessment of Educational Progress (NAEP) is the largest nationally representative and continuing assessment of what America's students know and can do in various subject areas. Assessments are conducted periodically in mathematics, reading, science, writing, the arts, civics, economics, geography, U.S. history, and beginning in 2014, in Technology and Engineering Literacy (TEL). Since NAEP assessments are administered uniformly using the same sets of test booklets across the nation, NAEP results serve as a common metric for all states and selected urban districts. The assessment stays essentially the same from year to year, with only carefully documented changes. This permits NAEP to provide a clear picture of student academic progress over time."
Leon L. Hulett, PE
Thu, 10/02/2014 - 8:56pm
Jerri B (October 2, 2014 at 8:22 pm): Are you saying that as an employer, if I ask a new graduate/employee to demonstrate something they had learned by having them do a task for me, the NAEP will be an accurate predictor of whether they will be able to do the task efficiently? I don't think so. If you would like to try this out, we could take the easiest task available and have this new individual, with no company training, attempt to do it. You watch what happens. Leon L. Hulett, PE
Jerri B
Fri, 10/03/2014 - 9:24pm
"I" am not saying anything. "I" am simply showing information based on the fact that this study used National NAEP scores. A point some of the prior comentators seem to have missed-including yourself.
Leon L. Hulett, PE
Sat, 10/04/2014 - 9:01pm
Jerri B (October 3, 2014 at 9:24 pm): I have been very aware of the NAEP for the last 20 years. Did you know they (NAEP) said in 1994 that only 5 percent of American high school graduates are at grade level in math, and that 50 percent of math students drop out each year? My point, from observing many new hires as their first manager, is that they are not well prepared to work productively. The large company I first hired into has a six-month training program. A manager at a local CAD company says it takes five years to bring a newly graduated CAD employee on board and up to speed. I estimate American business spends as much as the entire public education budget each year re-educating employees to be able to do the work needed. I feel this is an outlandish amount of money [a penalty] to have to spend to get productive employees who should have been better prepared in K-12. You have boldly presented your NAEP reference and chided me personally for apparently having missed your point. Do you feel you really appreciate what the NAEP is telling you? Best regards.