Michigan's overachieving school districts are State Champs

Seventy-three Michigan school districts earned top marks from Bridge Magazine as 2014 Academic
State Champs.

The winners – comprising the top 5 percent of 507 school districts across Michigan – include tiny rural districts and large metro ones, impoverished districts and the more affluent, charter
schools and traditional public schools.

Districts are ranked according to the most detailed data
analysis we’ve ever conducted, taking into account grade-level test results and student income.

Our analysis reveals small triumphs in unlikely places, such as Ashley Community Schools, a tiny rural district in Gratiot County, about halfway between Lansing and Mount Pleasant, which has succeeded despite a student population in which nine out of 10 students are eligible for free or reduced lunch. Ashley is the state’s 4th-ranked district.

“This shows that we’re doing good things with kids, even if our circumstances are different from a Bloomfield Hills,” said Superintendent Tim Hughes, referring to the Oakland County district where fewer than 10 percent of students are eligible for subsidized lunch. (Accounting for wealth, Bloomfield Hills ranked 33rd overall, and its high school ranked No. 6.)

The methodology and data analysis, independently developed by Public Sector Consultants, a public policy research firm in Lansing, takes into account the impact of poverty by analyzing how
districts across Michigan perform compared with districts of similar socioeconomic levels, an
acknowledgement of the debilitating effect that poverty typically has on student achievement.

Yes, this means some poorer districts are State Champs even though they have lower (in some cases, dramatically lower) test scores than more affluent districts that didn’t make the cut. Why? When income is taken into account, their students outperformed their peers by a wider margin. Are these ratings perfect? Of course not. Other factors can also affect achievement, including the level of pre-K education and cultural and language differences. Still, Academic State Champs, now in its fourth year, allows Bridge to go far beyond raw test scores to more fairly compare performance in schools and districts across all income levels.

Going deeper this year

Today’s school district awards are only the beginning.

Next Tuesday, Feb. 10, Bridge will release its first State Champs for individual schools, based on deeper analysis of school-level test score trends and poverty data. Bridge’s analysis this year is made possible by underwriting from Herman Miller Cares. (Donors and underwriters to the Center for Michigan and Bridge have no control over editorial content.)

Bridge is able to go deeper this year because we’re crunching student data in more grades
than ever, using test results from the state MEAP and Michigan Merit Exam and the high
school ACT. In past years, Champs was based on testing across three grades; this year, we’re
analyzing test results in eight grades.

That means we can reward overall district excellence, as we have in past years, but also
recognize leading districts at the elementary, middle and high school levels. It also means
Bridge and MLive readers can dive more deeply into our easy-to-navigate district database
to see how districts compare to districts of similar size, location, income
level and other factors.

Some notable district findings:

  • Ann Arbor schools, with more than 16,400 students, was the highest-ranked large school district (more than 10,000 students). Its students succeeded in almost every grade and subject, and its high school students performed among the best on the ACT, both in raw numbers and adjusted for poverty.
  • In Detroit, the Martin Luther King, Jr. Education Center Academy, a K-8 charter school authorized by the Detroit public schools, was ranked No. 1 in both its elementary setting and its middle school setting. (In Michigan, charters are counted as districts, even when they are a single school). More than 90 percent of MLK students are eligible for
    subsidized lunch.
  • Okemos was the top-ranked higher-income district. Its high school was among the state’s best, with more than half of its juniors already considered college-ready in all four ACT subjects – exceeding rates in similarly wealthy districts.

Why poverty

Why do we focus on poverty? Because numerous studies have shown that poverty and income are some of the best predictors of student success.

So it is in Michigan. For example, look at the scatter plot at the right. It shows how Michigan schools with students at different income levels performed in 4th-grade math in 2013. As you can see,
very few low-income schools performed well, and very few higher-income schools performed poorly. It is from such analyses that researchers can predict how a school is likely to perform according to income, calculations that form the basis of Bridge’s Academic State Champs.
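The same idea can be shown in a few lines of code. This is a toy version with synthetic schools, not the real MEAP data behind the chart: fit a line to pass rates versus each school’s low-income share, then report the predicted score and the R-squared (the share of variation income alone explains, a question a commenter raises below).

```python
# Toy version of the scatter-plot analysis: synthetic schools, not MEAP data.
import numpy as np

rng = np.random.default_rng(0)
n_schools = 200

# Each school's share of low-income students, and a pass rate that tends to
# fall as that share rises (plus noise).
pct_low_income = rng.uniform(0.0, 1.0, n_schools)
pass_rate = 75 - 40 * pct_low_income + rng.normal(0, 8, n_schools)

# Line of best fit: the pass rate we would predict from income alone.
slope, intercept = np.polyfit(pct_low_income, pass_rate, 1)
predicted = slope * pct_low_income + intercept

# R^2: the share of school-to-school variation that income alone explains.
ss_res = np.sum((pass_rate - predicted) ** 2)
ss_tot = np.sum((pass_rate - pass_rate.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot

print(f"predicted pass rate at 90% low-income: {slope * 0.9 + intercept:.1f}%")
print(f"R^2 on this synthetic data: {r_squared:.2f}")
```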

Enter Bridge’s database and check out our Top 10 lists.

And come back to Bridge and MLive next week for school-level rankings.

Happy exploring. And congratulations to this year's Academic State Champs districts.


Comments

Tue, 02/03/2015 - 10:12am
Dear Bridge Magazine: I read with interest your article on successful schools and the socio-economic factors affecting test results. Your attempt to "throw good schools under the bus" has failed. Last September, Newsweek magazine recognized Rockford High School as one of the Top 500 high schools in America. Number 247 to be exact. The criteria used by Newsweek included schools whose students scored at or above the 80th percentile on standardized assessment tests within each state. Schools were also assigned a college readiness score and then rank-ordered based on factors such as weighted SAT/ACT composite scores; weighted AP/IB composite scores; graduation rates; etc. Unfortunately, you have managed to throw Bridge Magazine "under the bus". Thanks for listening. Dr. Mike Shibler, superintendent of the Rockford Public Schools.
Mike Wilkinson
Tue, 02/03/2015 - 1:26pm
Dr. Shibler, There are a number of ways to measure schools, and Newsweek and U.S. News and World Report have their methodologies. Ours, obviously, is different and measures how a school district does compared to peers with similar student family income levels. Here's our methodology: http://cdn.bridgemi.com/wp-content/uploads/2015/01/Achievement-Exceeding.... Newsweek's methodology, if you have stated it correctly, would mean that no school or district below the 80th percentile on a test could be recognized for succeeding – even if it were in the 75th percentile and a statistical analysis showed that its expected score would be the 10th percentile. We believe our model is one that allows us to highlight those districts that have overcome obstacles.
Chuck Fellows
Wed, 02/04/2015 - 8:02am
Using aggregated and manipulated data produces no actionable information. A basic rule of data: data without context is meaningless. Studies that use scores and percentiles are meaningless, since the conclusions drawn from them contain absolutely no actionable information from which to continually improve the system. Indicators built on this methodology ensure that traditional public education will be stuck in the morass of producing a 30% dropout rate and graduates in need of remedial class work in order to progress in their education (if they aren't so discouraged that they just give up). Using reports derived from this erroneous process will result in continual ranking and rating, identifying children as winners and losers instead of focusing on learning - even Bridge Magazine is susceptible to this grievous error in judgment. If you wish to know if a child is learning, ask the child, something a teacher does every day and frequently with formative testing. Tell your resident psychometrician to study the works of Shewhart, Chambers, Deming and Wheeler before they pass statistical judgment on our children.
Bob
Tue, 02/03/2015 - 10:21am
Using the state testing data is no different than posting the height of all the students in a school. There are absolutely no validity studies showing the relationship between state testing scores and schools' actual performance. SES trumps all. And any outlier such as the example above is just that - an anomaly! It is an outlier. It adds no helpful information to the questions and concerns of other schools.
Steve
Thu, 02/05/2015 - 5:01pm
So let me make sure I understand you, Bob. Schools shouldn't be graded based on test scores because they can't reflect real outcomes. Yet school systems are based on teaching, testing and grading. Help me out, because that makes absolutely no sense.
Chuck Fellows
Thu, 02/12/2015 - 3:22pm
Schools where real learning occurs have frequent teacher formative testing, one-on-one observation and performances of understanding. The once- or twice-a-year omnibus standardized testing that this Bridge report relied upon doesn't count for much. Relying on the scores produced by statistically manipulated, pseudo-scientific, summative, multiple-choice standardized tests to rank and rate children or schools is irresponsible, especially since the evidence discrediting the practice of standardized testing is so plentiful and accurate. Sadly, Bridge Magazine and their supporters in this endeavor have relied upon the latter, feeding the myth of failing schools that is supporting the multi-billion-dollar testing industry in America. Very harmful nonsense.
Tom
Tue, 02/03/2015 - 1:51pm
Once again Bridge overinflates charters to promote a narrative that they do it better than public schools. Sorry, not buying it. As a Dearborn Heights resident, it's insulting to see the family-owned racket known as Star International Academy that high on the list. The school is a joke, and I don't care how you twist your convoluted scale, because those of us who are residents here know how badly their teachers are treated (they can't retain them) and that it's run like a money maker for the family. Their students are not college ready from what I've seen at the next level. Even though the Crestwood schools are ranked very high, there is no way I am sending my kids to the high school that lacks resources (like air conditioning) and discipline. It almost appears you are punishing schools like Novi, Northville, South Lyon, Brighton, Rockford and East Lansing for probably lacking diversity, or because the few minorities that come from Detroit, Pontiac, Redford, Lansing, etc., can't catch up because they are grade levels behind where they should be. Just like the State holds it against them. I'll take the eyeball test any day over this ridiculous scale you folks keep posting with an agenda behind it. You send your kids to these schools and I'll trust my instincts (with the exception of Ann Arbor, you got that one right, but that's it on your first page).
Carol Hovey
Tue, 02/03/2015 - 2:29pm
I appreciate that you are attempting to separate school learning methods from the obvious advantages available in higher socioeconomic homes. We cannot equate scores when conversation at home, travel experiences, books, technology, even access to paper and pencils are so vastly different. Being realistic means looking at early knowledge when kids arrive in Kdg. and how that grows exponentially for those who are lucky enough to be nourished physically and emotionally in positively reinforced, safe, encouraging home settings full of privileges and hope for future endeavors.
becca
Wed, 02/04/2015 - 1:00pm
I appreciate the last chart for explaining "why poverty". What's the R^2 for that (i.e., what percentage of test scores could be explained by % lower-income kids)? Thank you for attempting to identify schools that academically "punch above their weight", so to speak; I wish more people could see the value in it. The real question for Ann Arbor is whether they outperform Champaign-Urbana, IL. Either that, or control for parental education instead of family income (could you do a sub-analysis of, say, just the top 10 districts using parental education instead of income? I'm wondering how much it would change things).
Lynn
Thu, 02/05/2015 - 9:36am
Most of the comments on your article miss the most important part of the methodology used for the ranking of the 2014 Academic State Champs--that the statewide overall rankings are a measure of OVERACHIEVEMENT, not achievement. Finally...a methodology that attempts to level the playing field by considering the income of students enrolled in our schools. What a breath of fresh air!!!
Thu, 02/05/2015 - 10:29am
I have three comments on the methodology for these rankings. (1) This methodology should not be over-sold as reflecting school district or school quality. All it represents is whether a district does better or worse than would be expected based on a relatively crude measure of socioeconomic status, a district's free and reduced-price lunch percentage. I think the temptation is to interpret what remains as being due to the quality of the district or school. But probably it is still the case that a sizable proportion, maybe the majority, of the remaining variation in school district or school test scores is NOT due to district or school quality, but rather is due to other factors outside the district's control. To take one example, two districts could have the SAME FRL percent, but the FRL-eligible kids in one district could on average have much lower incomes than in the other district.

(2) I think all analyses of these data by Bridge and others should prominently mention the sample-size issue. Especially at the school level, but even at the district level, we would expect that smaller schools or districts would be more likely to show large discrepancies in performance below or above the regression line. Some of the remaining variation in test scores is simply statistical noise. Readers should be very cautious about interpreting numbers that are substantially above or below the regression line (what is predicted based on FRL percent) for small schools or districts. Bridge should produce standard errors for some of these estimates.

(3) I think it would be better to use the state standardized scale scores in the analysis rather than the pass rate. The pass rate is a noisier measure than the standardized scale score. In fact, the pass rate for each student is simply a zero-one variable for whether the student is above or below some cut score in their standardized scale score. Relatively small changes in student performance can cause dramatic changes in the pass rate. I would urge Bridge in next year's analysis to consider using ALL the information contained in the standardized scale scores rather than artificially throwing out some information by using the pass rate. This is also a problem in virtually all discussion of Michigan students' test scores – everyone uses the pass rate, which is noisier than mean scale scores.
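A quick simulation makes the sample-size point concrete. The numbers are invented, not drawn from any Michigan district: two districts with the same true proficiency rate, one small and one large, and the measured pass rate bounces around far more for the small one.

```python
# Simulated example of the sample-size issue; all numbers are made up.
import numpy as np

rng = np.random.default_rng(1)
true_rate = 0.50              # same "true" proficiency in both districts
n_small, n_large = 40, 2000   # tested students: small vs. large district

# Simulate many hypothetical testing cycles and measure the pass rate.
small = rng.binomial(n_small, true_rate, 10_000) / n_small
large = rng.binomial(n_large, true_rate, 10_000) / n_large

print(f"small district: {small.mean():.2f} +/- {small.std():.3f}")
print(f"large district: {large.mean():.2f} +/- {large.std():.3f}")
# The small district's measured pass rate swings far more by chance alone,
# so its distance from the regression line is much more likely to be noise.
```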
Chuck Jordan
Sun, 02/08/2015 - 12:08pm
Tim, could you explain state standardized scale scores? I wonder if Bridge is also considering numbers of students in single-parent families. That might make a difference comparing poor rural districts and urban districts. It would also be interesting to consider average class sizes (with and without mixed ability ranges) and average teacher salaries. That might make it easier to learn what makes some schools better than others. However, while all of this is interesting, comparing schools and districts using any standardized test measures does nothing to do what assessment needs to be doing, which is to focus on the student and improve the teaching methods and styles that would improve student learning. But hey, this is fun.
Mon, 02/09/2015 - 10:36am
When students take the state tests, they get a certain number of questions right and a certain number wrong. For each year, the state then translates the percentage right into a "standardized test score," based on the relative difficulty of the test that year. These standardized test scores are comparable over time for a given grade and subject – that is, a student who got a certain standardized test score on the 2014 version of the test would be expected to get the same standardized test score if they had taken the 2009 version of the test, at least on average. (In practice, student performance varies over time due to any number of factors.)

The percentage "passing" the MEAP is simply the percentage of students for that grade and subject who have a standardized test score that exceeds some cut-off. When we use the percentage "passing" the MEAP as our measure of student achievement, we are throwing away a huge amount of information that is contained in the standardized test score. We are throwing away information that is contained in how far above or below the cut score each student is, for example. Focusing on the cut score puts a huge weight on whether a school district is moving students from just below to just above the cut score, even though such small changes in student achievement have no real meaning. Focusing on the cut score also puts no weight on large improvements in student achievement among students who are either well below or well above the cut score, even though such improvements in student achievement are more likely to be associated with dramatic improvements in life prospects.

The mean standardized test score for all students in a school, a school district, or a state has the merit of placing an equal weight on the student achievement of all students in each of these groupings. Any improvement in student achievement by any student will affect the mean. In contrast, the percentage above the cut score is only sensitive to achievement gains of students close to the cut score. From a statistical point of view, the standard error in measuring the student achievement of a group will be smaller for the mean standardized test score than for the percentage passing a test; with pass rates, you need larger sample sizes to get equally accurate measures of the population mean for a group.

Almost all media coverage of student test scores in the state of Michigan over-emphasizes trends in the percentage of students "passing" the MEAP. We would have a better debate about student achievement if instead we focused more on improvements in mean standardized test scores. We could also discuss improvements in different percentiles of the test score distribution, but let's start with mean test scores and median test scores, and go on from there to discuss how student achievement is changing at different percentiles of the Michigan student test score distribution.
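A small worked example of the cut-score point, using invented scale scores rather than real MEAP data: one student improves substantially but stays below the cut score, so the mean moves while the pass rate does not.

```python
# Invented scale scores (not real MEAP data) showing how the pass rate
# ignores improvement that happens away from the cut score.
import numpy as np

CUT_SCORE = 100
before = np.array([70, 85, 99, 101, 120, 140])
after = np.array([90, 85, 99, 101, 120, 140])   # one student: 70 -> 90

def pass_rate(scores):
    """Share of students at or above the cut score."""
    return float(np.mean(scores >= CUT_SCORE))

print(f"mean scale score: {before.mean():.1f} -> {after.mean():.1f}")
print(f"pass rate:        {pass_rate(before):.2f} -> {pass_rate(after):.2f}")
# The mean rises (102.5 -> 105.8) but the pass rate stays at 0.50, because
# the gain occurred entirely below the cut score.
```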
Chuck Jordan
Wed, 02/11/2015 - 7:47pm
Thank-you Tim. What do you think of A B C D F grades for schools?
Thu, 02/05/2015 - 11:31am
How does a district with low single digit or even 0% proficiency in certain subject areas receive an 'Exceeds' score? Are income levels that high of a weighting in the formula?
John
Sat, 02/07/2015 - 6:00pm
The problem I see is that school districts are going to use this “honor” to try to avoid the discussion of the overall failure of the school district. This is the mass text I received from our high school principal. Mr. Bacon: Great news! Pellston High School received Academic State Champ honors and is ranked 23rd among high schools in the state! A letter home will soon follow. Congratulations to our students, families and staff! Because I could not believe that our failing district would receive such an “honor”, I followed the link and discovered the criteria.
Allen Cumings
Mon, 02/09/2015 - 8:05am
Has Bridge Magazine considered looking at schools that over the past 3 years have shown positive academic proficiency trends? Many schools in the state have worked hard to improve instructional practice and curriculum to grow student learning. Many of these schools are showing consistent growth; however, in this report only a 3-year average of proficiency is calculated. For example, your No. 4 district in 3rd grade was 44% in math in 2011, 30% in 2012 and is now 22.7% (average 32.3% – with high poverty, they score high). Reading in 3rd grade was 80% in 2011, 69.5% in 2012 and 36.4% in 2013 (61.9% average – a high score when calculated with high poverty rates). The negative trend for this school continues in math in 4th and 5th grade; however, reading score trends are more positive. Congrats to them for their hard work (note the elementary does not rank in the top 10). Trend data needs to be considered in reviewing schools. Proficiency is the lagging indicator of growth; therefore, you could be leading people to think a school is outperforming others based on a 3-year average. Since other considerations, such as the percent of economically disadvantaged students, are used in the formula, you may want to consider an alternative list that identifies schools that have demonstrated consistent and positive academic proficiency trends over a 3-year period. Based on Phil Power's article, if I were looking to purchase a house and was deciding on a location based on the schools in the area, academic proficiency trends (or the lack thereof) would be an important indicator to base my decision on.
John Nash
Mon, 02/09/2015 - 10:17am
In one breath we say all students can learn. We can find examples at any point in history of people who were either poor or deemed slow who have beaten all the odds and become some of our best thinkers. I understand why so many researchers use free and reduced lunch figures, but what about those very kids – are they to go through life thinking there was no way they could succeed educationally, or worse still, that everyone expects them to fail? How surprising is it that the school districts that U of M and MSU professors' children attend do so well? I also have a problem with using test scores as the only determinants of academic success. Again, history would tell us there are many, many exceptions to that way of thinking. I appreciate your efforts. The best education for every child is what we all have to be striving for, but basing so much on income level and test results is somewhat hollow in my mind.
s.melvin
Mon, 02/09/2015 - 12:04pm
No child has an income. Most children have fathers and mothers who are in the military or working at minimum wage. Let's show our true colors, raise the minimum wage to $15.00, and bring children out of poverty and into better schooling now! Also ensure that mothers who homeschool their children get a salary of $2,000 a month. To have a better school system: install a 20x20 TV in each classroom and have colleges teach into that classroom, so all children have the best teacher in each school and are ready to go to college at age 16 (my grandchildren were homeschooled)! The best school in Michigan is the International school in Bloomfield. Report cards: use the same system as the voting machines – fill in the questions and answers. Any computer geniuses in the house?
Ellen Cannon
Tue, 07/07/2015 - 4:26pm
Coming into this late, but I would be very curious as to how data on students with disabilities was included. Was it included? Many students receiving special education services still take the MEAP. Some take MI-Access. So I'm assuming that this data was included as long as they took the MEAP. I'm curious whether you have a breakdown of the ratio of students with disabilities. You have looked at poverty, so I'm just wondering if there were any similar comparisons or considerations to look at data around disability.