Everywhere that governments publish school results and permit league tables to be published, schools and school systems manipulate their results to look better. Published school results are a form of modern-day alchemy that only deceives.
An example of the absurd lengths to which school systems will go to fudge their results comes from Texas. New rules introduced last year allow schools to count students who failed the Texas Assessment of Knowledge and Skills (TAKS) as passing, as long as a complex formula shows that those students are predicted to pass in a future year.
Under the new arrangements, if a student fails the Year 7 math TAKS, the school can use a statistical formula developed by the Texas Education Agency to predict whether that student will pass the math test in Year 8. The formula considers the student’s math and reading TAKS scores, plus the average math TAKS score at the school. If the student is predicted to pass, the school gets to count that student as actually passing in Year 7 even though the student really failed.
The new system is foolproof against failure. A school never has to go back and compare the predicted performance with the actual performance in Year 8. It can record a Year 7 student who failed the TAKS as a pass if the student is projected to pass in Year 8, but it is not penalized if that prediction proves wrong. Instead, the model looks ahead again to predict whether the student will pass the Year 11 TAKS.
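The accounting trick can be sketched in a few lines of code. This is a minimal illustration only: the Texas Education Agency has not published its formula here, so the weights and the passing mark below are invented stand-ins that merely use the inputs named above (the student’s math and reading scores and the school’s average math score).

```python
# Illustrative sketch of "projected pass counts as a pass" accounting.
# predict_pass() is HYPOTHETICAL: the real TEA formula and its weights
# are not given in the article; only its inputs are.

def predict_pass(math_score, reading_score, school_avg_math, passing_mark=2100):
    """Hypothetical projection: blend the student's own scores with the
    school's average math score and compare against the passing mark."""
    projected = 0.5 * math_score + 0.2 * reading_score + 0.3 * school_avg_math
    return projected >= passing_mark

def reported_pass_rate(students, school_avg_math, passing_mark=2100):
    """Count a student as passing if they passed OR are projected to pass."""
    counted = sum(
        1 for math, reading in students
        if math >= passing_mark
        or predict_pass(math, reading, school_avg_math, passing_mark)
    )
    return counted / len(students)

# Three students as (math, reading); only the first actually passed Year 7.
students = [(2200, 2150), (2050, 2300), (1900, 1800)]
actual = sum(1 for m, _ in students if m >= 2100) / len(students)
reported = reported_pass_rate(students, school_avg_math=2150)
```

With these made-up numbers the actual pass rate is one in three, but the reported rate is two in three, because the second student’s failure is replaced by a projected future pass. Nothing in the scheme ever checks the projection against the Year 8 result.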
The result has been that hundreds of schools received a higher rating on the TAKS. Last week, the Houston Chronicle [20 June] reported that the number of “exemplary” schools, the highest rating, more than doubled from 1,000 in 2008 to 2,158 in 2009. Without the statistical projections that some failing students would later pass, the increase would have been only 44 schools. At the other end of the spectrum, the number of “unacceptable” schools increased by 43, from 202 to 245. But without the statistical projections, the increase would have been 401, almost 10 times the adjusted figure.
In New York, officials secretly cut the pass score for English and mathematics tests to inflate student results and show improvement. The New York state education department dropped cut scores on the state tests from 2006 (the year that annual testing in grades 3-8 was introduced) to 2009. In 2006, a student in 7th grade could achieve “proficiency” by getting 59.6 percent of the points correct on the state math test; by 2009, a student in the same grade needed only 44 percent of the available points.
With such chicanery it is little wonder that Diane Ravitch, Professor of Education at New York University and author of the best-seller The Death and Life of the Great American School System, says that New York’s accountability system is a form of “institutionalised lying” which produces “rigged and fraudulent” results.
Already we see the beginnings of such manipulation of school results in Australia, only six months after the My School website was launched. Last month’s NAPLAN tests featured numerous incidents: schools encouraging low-achieving students to stay home on test days, and allegations of teachers helping students with answers, changing student answers on tests, and leaking questions before the tests.
State education officials told principals to lift their school results “at all costs”, threatened them with the sack if their results did not improve and directed them to prepare students with intensive test practice beforehand.
Fraud and manipulation are inevitable in a school reporting system which puts the careers of teachers and principals and the reputations of schools on the line. Such manipulation can only grow, and as it does, the school results published on My School will become more and more unreliable as an indicator of school quality. Parents who rely on My School to choose a school are likely to be misled.