At the beginning of September, Newsweek published a ranking of "Top High Schools" in the United States.
The stupidest statistical reading any group of people has ever done.
First off, there is a serious issue with how they even begin to narrow down the schools they will be looking at. For the sake of keeping the original tone and message of the piece, I'll pull exactly what they said about this so-called "threshold" analysis: "First, we created a high school achievement index based on performance indicators."
What do you mean by “performance indicators,” exactly? What do these editors consider a performance indicator to be? Is it some list of proficiency rates on standardized assessments?
Actually, I can quote that too: "i.e., proficiency rates on standardized assessments." And one more line, on how the threshold only applies to "high schools that perform at or above the 80th percentile within each state."
Now let me ask something: how biased toward standardized testing can you be, just to make your job easier?
Standardized testing doesn't prove anything, which makes this threshold completely arbitrary. It is statistically impossible for a school of, say, Akins' size to score in the same percentile as a school such as LASA, which is nearly a third its size. This threshold serves only one purpose: making the Newsweek staff's job easier.
Oh, and secondly, if you thought the threshold was stupid, take a look at the “Ranking” Analysis: “We created a College Readiness Score based on the following six indicators:
- Enrollment Rate – 25 percent
- Graduation Rate – 20 percent
- Weighted AP/IB composite – 17.5 percent
- Weighted SAT/ACT composite – 17.5 percent
- Holding Power (change in student enrollment between ninth and 12th grades; this measure is intended to control for student attrition) – 10 percent
- Counselor-to-Student Ratio – 10 percent"
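To see how mechanical this methodology is, here is a minimal sketch of that weighted composite. The weights are the ones Newsweek lists above; the per-school indicator values are hypothetical, assumed to be pre-scaled to 0–100, since the article doesn't say how each indicator is normalized.

```python
# Newsweek's "College Readiness Score" is just a weighted sum of six
# indicators. Weights are from the article; everything else here is
# a hypothetical illustration.

WEIGHTS = {
    "enrollment_rate": 0.25,
    "graduation_rate": 0.20,
    "ap_ib_composite": 0.175,
    "sat_act_composite": 0.175,
    "holding_power": 0.10,
    "counselor_ratio": 0.10,
}

def readiness_score(indicators):
    """Weighted sum of the six indicators (each assumed scaled 0-100,
    higher = better)."""
    return sum(WEIGHTS[name] * value for name, value in indicators.items())

# Hypothetical school: strong graduation and test composites,
# middling holding power and counselor ratio.
example = {
    "enrollment_rate": 90.0,
    "graduation_rate": 95.0,
    "ap_ib_composite": 80.0,
    "sat_act_composite": 85.0,
    "holding_power": 70.0,
    "counselor_ratio": 60.0,
}
print(readiness_score(example))
```

Note that nothing in this formula accounts for school size or student demographics, which is exactly the problem: a small, self-selected student body lifts almost every one of these numbers at once.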
Literally, the entirety of the "Top Schools" list consists of schools with small populations and students concerned with their academic success. In other words, the "Top Schools" in the U.S. are what is more commonly known as magnet schools. This explains why LASA is one of the "Top Schools" while Akins wasn't even able to compete on the same footing for such an esteemed label.
So when you hear that magnet schools are supposedly the best in the nation, just know that these rankings are designed to sell magazines and to make the editors' jobs easier, not to accurately analyze which schools are more effective at educating students.