Why Schools' Ranks Changed in the 2014 Edition of Best High Schools

Parents and high school officials are often curious why public schools moved up and down in the U.S. News Best High Schools rankings when compared with previous years. When examining the newly released 2014 edition, there are several possible answers to this frequently asked question.

1. Changes in relative performance on state tests: Some schools that were ranked in the 2013 Best High Schools rankings fell off the new 2014 list completely because they were no longer among the best-performing schools on their statewide tests -- meaning that their overall student performance on state tests during the 2011-2012 academic year did not exceed statistical expectations (Step 1 of the rankings methodology) or the performance of their least advantaged students was not as good as the state average (Step 2 of the methodology).

Schools that did not pass both of these steps of the methodology were not eligible for the national competition for a gold, silver or bronze medal and do not appear in the rankings at all.
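The two-step filter described above can be sketched in code. This is a hypothetical illustration only -- the field names and the way each step is modeled are assumptions, not U.S. News's actual methodology code.

```python
def eligible_for_medal(school):
    """Return True only if a school passes both methodology steps.

    Hypothetical sketch: `school` is a dict with illustrative keys,
    not a real U.S. News data record.
    """
    # Step 1: overall state-test performance must exceed statistical
    # expectations (modeled here as a precomputed boolean flag).
    if not school["exceeds_statistical_expectations"]:
        return False
    # Step 2: the school's least advantaged students must perform at
    # least as well as the state average for comparable students.
    if school["disadvantaged_score"] < school["state_disadvantaged_avg"]:
        return False
    # Only schools passing both steps move on to the national medal
    # competition; all others are excluded from the rankings entirely.
    return True

example = {
    "exceeds_statistical_expectations": True,
    "disadvantaged_score": 71.4,
    "state_disadvantaged_avg": 68.9,
}
print(eligible_for_medal(example))  # True: passes both steps
```

Failing either check alone is enough to drop a school from medal consideration, which is why year-to-year swings on either measure can remove a school from the list entirely.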

In total, 3,022, or 63 percent, of the high schools that were awarded a gold, silver or bronze medal in the 2013 Best High Schools rankings returned to the 2014 rankings as a gold, silver or bronze medal winner. That means that 37 percent of the high schools that were ranked in 2013 fell out of the rankings in 2014.


Of the schools that were gold medal winners in the 2013 rankings, 84 percent returned to the 2014 rankings as a gold, silver or bronze medal winner. A large majority of the 2013 gold medal winners -- 74 percent -- returned as gold in 2014.

Of the schools that were silver medal winners in the 2013 rankings, 67 percent returned to the 2014 rankings as a gold, silver or bronze medal winner. More than half of the 2013 silver medal winners -- 53 percent -- returned as silver in 2014.

And of the schools that were bronze medal winners in the 2013 rankings, 56 percent returned to the 2014 rankings as a gold, silver or bronze medal winner. More than half of the 2013 bronze medal winners -- 53 percent -- returned as bronze in 2014.

These results show that the bronze high schools were much less consistent in their year-to-year performance on statewide tests, especially when compared with the relatively high year-to-year consistency among the gold schools, and to a somewhat lesser degree, the silver medal winners.

2. Changes in relative or absolute performance on college-level course work: Some ranked schools may have moved either up or down in the 2014 rankings compared with last year due to how their 12th-grade class in 2011-2012 participated in and performed on Advanced Placement or International Baccalaureate exams compared with the 2010-2011 class.

U.S. News determines the college readiness of each school by analyzing the performance and participation of 12th-graders from the graduating class cohort in the most recent academic year -- in this case, the 2011-2012 school year. This means we looked at whether these students took and passed any AP or IB exams during their years at the school, up to and including their senior year.

Many schools' standing changed as a result -- from moving a few places within the gold medal rankings to switching medals entirely (gold to silver, silver to bronze, bronze to gold or bronze to silver) -- because of changes in a school's College Readiness Index.

3. New medal winners: Some schools were new to the 2014 rankings because they passed both Step 1 and Step 2 of this year's methodology but didn't pass both of those steps in 2013 and therefore weren't eligible for a gold, silver or bronze medal.

Other high schools became eligible to be ranked for the first time in 2014 since they were relatively new schools and had their first 12th-grade class graduate in 2011-2012, or their graduating class was now large enough to be included in the rankings.

In total, 1,685, or 36 percent, of the high schools that were awarded a gold, silver or bronze medal in the 2014 rankings were not ranked in 2013. Specifically, 70 of this year's gold medal winners, 442 of the silver medal winners and 1,173 of the bronze medal winners were not ranked in 2013.
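The figures reported above are internally consistent, which a quick check confirms. The 4,707 total and the implied 2013 count are derived here from the article's own numbers, not stated in the source directly.

```python
# New-to-2014 medal winners by tier, as reported in the article.
gold_new, silver_new, bronze_new = 70, 442, 1173
new_winners = gold_new + silver_new + bronze_new
print(new_winners)  # 1685, matching the article's stated total

# Returning medalists (ranked in both 2013 and 2014).
returning = 3022
total_2014 = returning + new_winners  # 4707 medal winners in 2014
print(round(100 * new_winners / total_2014))  # 36 -> "36 percent" new

# The 3,022 returners were 63 percent of 2013's medalists, implying
# roughly this many schools were ranked in 2013:
print(round(returning / 0.63))  # about 4797; 37 percent fell out
```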

4. Suppression of state assessment results or incomplete state assessment data: Some high schools with top college-readiness performance that were medal winners in 2013 weren't eligible to be ranked in 2014 because their state withheld portions of their math and English state assessment results from public release.

There were also schools that weren't ranked in 2013 that may have been eligible for medals this year, but certain portions of their state assessment data were suppressed or missing.

States may have suppressed data for various reasons, including protecting the identities of individual students or of students in particular subgroups, or because a state deemed data it determined "unreportable" unfit for public release for other reasons.

It is possible that in some of these cases the data were redacted because a high percentage of students in the school achieved the same standard -- for example, more than 90 percent of students scored above proficient -- and the data were suppressed from public view.

Data suppression meant there was not enough state assessment data available to calculate a performance index for those schools as part of the 2014 rankings. The performance index is the key component needed to pass Step 1 of the rankings methodology. Therefore, schools without a performance index did not pass Step 1, which in turn meant they were not eligible for medal consideration.

Out of the 19,411 high schools eligible for the 2014 rankings, 1,798, or approximately 9 percent, did not have enough data to calculate a performance index and were therefore removed from the analysis. The following states had a higher-than-average percentage of schools without a performance index: Delaware, Kansas, Maine, Montana, Nebraska, North Carolina, Nevada and Oklahoma.
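The "approximately 9 percent" figure follows directly from the two counts given above:

```python
eligible = 19_411   # high schools eligible for the 2014 rankings
no_index = 1_798    # schools lacking data for a performance index
share = 100 * no_index / eligible
print(round(share, 1))  # 9.3, i.e. "approximately 9 percent"
```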