Military intelligence?

Many news outlets, including my local paper, recently carried an AP story about a report issued by The Education Trust.  In the report, we learn that one out of every four people who take the U.S. military’s entrance exam fails.  The report and article use these findings to indict the education system in the United States.  Unfortunately, they are more of an indictment of the authors.  While the Armed Services Vocational Aptitude Battery (ASVAB) is required at “hundreds” of high schools, it is by no means ubiquitous.  The sample, then, is not random, but largely self-selecting.  Consider also that the ASVAB is not required for officer-track students (i.e. service academies and ROTC), but only for enlisted personnel.  When I first read the article, I immediately realized that the conclusion wasn’t justified.

It wasn’t until I did some further research that I realized exactly how wrong the authors were.  As it turns out, ASVAB scores are given as percentiles.  In other words, to get into the Army, you need not get 31% of the questions correct; you need to score better than 31% of the other test takers.  This means that the military automatically rejects the lowest scorers, no matter how good or bad their answers may be on an absolute scale.  By construction, 31% of test takers will fall below the Army’s cutoff; the military grants waivers for low scores in certain situations, which is why only a quarter of test takers actually fail.
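To make the percentile point concrete, here’s a quick sketch in Python.  The raw scores are invented for illustration; only the percentile logic matters:

```python
# Percentile scoring: your score is the fraction of other test takers
# you outperform, not the fraction of questions you answered correctly.
# These raw scores are hypothetical.
raw_scores = [12, 25, 31, 40, 44, 52, 58, 63, 71, 85]

def percentile_rank(score, population):
    """Percentage of the population scoring strictly below `score`."""
    below = sum(1 for s in population if s < score)
    return 100 * below / len(population)

CUTOFF = 31  # must beat 31% of other test takers

for s in raw_scores:
    rank = percentile_rank(s, raw_scores)
    status = "pass" if rank >= CUTOFF else "FAIL"
    print(f"raw {s:3d} -> {rank:5.1f}th percentile ({status})")

# However well or poorly everyone does in absolute terms, roughly the
# bottom 31% of any group always lands below the 31st percentile.
```

Raise every raw score by fifty points and exactly the same people fail, which is the whole point: the exam grades on a curve, so a fixed slice of test takers is guaranteed to come up short.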

So the news here is that 25% of students fail an exam designed for a fixed fraction of them to fail.  In other news, water is wet.  On second thought, maybe this is an indictment of the education system, but not in the way suggested.  An elementary understanding of statistics immediately calls the credibility of the study into question.  One paragraph of a Wikipedia article demolishes the premise of the AP story.  The education system may have flaws, but the only flaws exposed here are the lack of statistical understanding and basic research ability on the part of The Education Trust and AP writers Christine Armario and Dorie Turner.

The secret to a successful school

Disclaimer: In the year or more since this blog started, I’ve made a concerted effort to avoid political discussion.  I have political opinions, some of them rather strong, but there are plenty of other places on the Internet where one can find barely-knowledgeable idiots ranting on about politics.  I’ve got other things that I’d rather talk about with my ones of readers.  With that in mind, today’s post isn’t intended to be a discussion of the political aspects of school policy, but just a look at what I consider to be interesting numbers.  You can draw whatever conclusions you like from it.

I am a member of the local newspaper’s community advisory board.  Once a month, the self-selected group sits down with the Executive Editor and the Managing Editor to discuss various topics that help keep the newsroom connected with the community.  A few months ago, as the state legislature was negotiating this year’s budget, the topic turned to education.  I knew that anything “for the childrens!” was likely to involve emotion and drama from all sides of the argument.  Arming myself with factual information would not only help me discuss the matter logically, but would give me enough to decide what my opinion even was.

What I did was not a rigorous analysis; it took about an hour and involved only a few bits of data.  Using the state’s website, I found various statistics on public school districts in Lafayette and the surrounding areas.  The first step was defining success.  Success needs to be quantifiable to be useful, but for some reason, the state does not have a metric labeled “success.”  As proxies for that elusive number, I used the graduation rate, the percentage of graduates who go on to college, and the pass rate for the ISTEP+ exam.

For the contributors to success, I tried to anticipate what would commonly be argued.  Since cutting school funding is a political sin, I looked at the dollars spent per student.  The teacher-to-student ratio is often used to indicate the quality of a particular school, so that data was added in.  Conservatives may argue that school systems are over-burdened with administrators, so I looked at the administrator-to-student ratio.  Liberals might suggest that poor and minority students are set up to fail, so I looked at the percentage of minority enrollment and used the percentage of students receiving free or reduced-price lunches as a proxy for income.  Having forgotten most of what I learned in my “Elementary Statistical Methods” class, I couldn’t do any impressive analysis.  What I did instead was to plot each factor against each measure of success.
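The back-of-the-envelope method described above can be sketched in Python.  The district numbers below are invented; only the technique — fitting a simple line for one factor against one success metric and reading off R-squared — mirrors what’s described:

```python
# For each candidate factor, fit a simple linear regression against a
# success metric and report R-squared.  All data here is made up for
# illustration; real figures came from the state's website.
import numpy as np

# Hypothetical districts: (% on free/reduced lunch, graduation rate)
pct_free_lunch = np.array([14, 22, 30, 41, 55, 60, 66], dtype=float)
grad_rate      = np.array([95, 92, 90, 85, 80, 78, 74], dtype=float)

def r_squared(x, y):
    """Coefficient of determination for a simple linear fit of y on x."""
    slope, intercept = np.polyfit(x, y, 1)
    predicted = slope * x + intercept
    ss_res = np.sum((y - predicted) ** 2)   # residual sum of squares
    ss_tot = np.sum((y - y.mean()) ** 2)    # total sum of squares
    return 1 - ss_res / ss_tot

r2 = r_squared(pct_free_lunch, grad_rate)
print(f"R^2 (lunch assistance vs. graduation rate): {r2:.3f}")
```

An R-squared near zero means the factor explains essentially none of the variation in the metric; a value near one means it explains most of it.  Repeating this for every factor/metric pair produces a grid of values like the table at the end of this post.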

Dollars per student

For the school districts I examined, total spending per student ranged from $8,100 to $12,500.  It is interesting to note that spending had no statistically significant relationship to the graduation rate or the ISTEP+ pass rate.  Spending and college enrollment rate were weakly related, but the relationship was negative: the more money spent per student, the smaller the percentage who went on to college.  It is important to note, of course, that correlation does not imply causation.  From the data, we cannot tell whether spending more per student tends to decrease college attendance, or whether fewer students going to college means a district gets more funding to try to improve that metric.  Either way, you can’t tell how successful a school is by how much money it spends per student.

Teachers per student

The argument often put forth is that small class sizes lead to more individual attention, which allows each child to learn better.  That makes sense.  From my friends and relatives in education, I can say with confidence that larger class sizes hasten teacher frustration.  However, the data does not bear this out.  Once again, two of the three pairs were meaningless: the ISTEP+ pass rate and the graduation rate were not statistically linked to the number of teachers per student.  The college rate did show a very weak relationship, but in the wrong direction: as the number of teachers increased, the percentage of students going to college dropped.  This makes sense in light of the spending numbers, since having more teachers raises a district’s per-student cost.

Administrators per student

An increased number of administrators also brings a higher cost, but with arguably less benefit.  The numbers show no benefit, at least as far as our “success” metrics are concerned, to having more administrators per student.  No doubt there are arguments both for and against a higher administrator-to-student ratio, but whatever benefits more administrators bring show up in outcomes other than the ones examined here.

Minority enrollment

It was not clear how the Indiana Department of Education defines “minority,” which makes drawing conclusions from the data a bit more difficult.  Fortunately, for ISTEP+ and college attendance there is no statistically significant relationship, so there are no conclusions to draw.  There is a weak relationship suggesting that as minority enrollment increases, the percentage of students graduating from high school decreases.

Family income

I saved family income for last because it alone produced truly meaningful results.  As I said earlier, income data for each school district was not readily available, so I used the percentage of students on free or reduced-price lunch assistance as a proxy: the higher the percentage, the poorer the district.  This metric ranged from 14% (West Lafayette) to 66% (Lafayette and Frankfort).  It is interesting to note that Lafayette and Frankfort schools also have the highest percentages of minority students.  There’s only a weak relationship indicating that poorer students are less likely to go to college, perhaps in part because of Indiana’s 21st Century Scholars program.  However, there’s a moderately strong relationship suggesting that wealthier students are more likely to pass ISTEP+ and to graduate from high school.

So what is the secret to a successful school?  Don’t have poor students.  As I said above, this is not a rigorous analysis, but it is notable that our income proxy is the only factor that meaningfully tracked the success metrics I picked.  I won’t speculate on an explanation.  Here are the R-squared values for the stat geeks:

                  $/student  teachers/student  admin/student  minority   income
grad rate           0.00029           0.01131        0.01415   0.39579  0.68394
ISTEP+ pass rate    0.00705           0.06211        0.02814   0.06409  0.79488
college             0.34121           0.18075        0.07707   0.01112   0.3498