School performance reporting and league tables create pressures and incentives for schools to fudge their results.
Competition for higher rankings forces schools to “play the system” to show improvement even where there is none. Playing the system is the quick route to better results. It is a feature of school performance reporting wherever it has been adopted.
A whole range of devices is used, including poaching high-achieving students from other schools, denying entry to low-achieving students, suspending low achievers on test days, encouraging low achievers to take courses not counted in school rankings, helping students with answers, changing students’ answers and devoting more time to rote learning and test-taking skills.
All of this is now beginning in Australia.
The West Australian reported on 13 July that WA schools are pushing many Year 12 students to choose easier subjects so that the students can avoid exams and the schools can lift their ranking on league tables. If struggling students are funnelled into courses with no exams, their scores will not be counted in a school’s overall tertiary entrance results, which are used to measure school performance.
The WA Curriculum Council chief executive, David Wood, said he was concerned that Year 12 enrolment data showed many students had opted for new courses at the easiest level, which did not involve exams.
“I had expected this would be more prevalent in schools from lower socio-economic areas, but indeed it is schools from across the socio-economic spectrum, including some high-profile independent schools,” he said.
This could be the tip of the iceberg. Last year, the Sydney Morning Herald (7 June) reported that a Sydney private school was forcing students to complete their HSC at TAFE if it appeared they would not score high marks. Parents said their children had not been allowed to sit their exams at the school. The school had rapidly improved its ranking in the Herald’s HSC results league table in recent years.
Such practices have been a feature of the English school system over the past decade and of many school systems in the United States, under pressure to maintain or improve school rankings.
Intensive test preparation at the expense of non-tested subjects is another common way of boosting school results, and it is already becoming too common in Australia. School systems around the country are pushing schools to devote more time to test practice at the expense of science, history, the arts, music and physical education.
Last April, the head of the Victorian Department of Education, Peter Dawkins, sent a memo to all principals suggesting more time be spent on preparing students for the National Assessment Program – Literacy and Numeracy (NAPLAN) tests so as to improve Victoria’s results.
The Age reported that teachers were being pressured to put more time into test practice (13 April 2009). The Courier-Mail in Brisbane reported in March that education officials were putting tremendous pressure on teachers to lift results by practising for tests. The West Australian reported that up to a quarter of school time was being spent on preparing for the tests (14 April 2009).
Not only does teaching to the test come at the expense of non-tested subjects, it also displaces the teaching of more complex thinking and writing skills that are central to a quality education.
The future for Australia can also be seen in recent US newspaper reports on how “fails” are turned into “passes” in order to meet school performance requirements under the No Child Left Behind legislation.
The absurd lengths to which schools will go to fudge their results reached new heights in Texas last week. The Dallas Morning News reported on 5 July that the Texas Education Agency had found a way for schools to report students as passing tests even when they fail.
As a result, hundreds of schools are expected to be able to report higher results when school ratings are announced later this month.
New rules applying from this month allow schools to count students who failed the Texas Assessment of Knowledge and Skills (TAKS) as passing, as long as a complex formula predicts that those students will pass in a future year.
Under the new arrangements, if a student fails the Year 7 math TAKS, the school can use a statistical formula developed by the Texas Education Agency to predict whether that student will pass the math test in Year 8.
The formula considers the student’s math and reading TAKS scores, plus the average math TAKS score at the school. If the student is predicted to pass, the school gets to count him or her as passing in Year 7 even though the student actually failed.
The new system is foolproof against failure. A school never has to go back and compare the predicted performance with the actual performance in Year 8.
A school can record a Year 7 student who failed the TAKS as a pass if the student is projected to pass in Year 8, but it is not penalized if that student does not pass in Year 8 as predicted. Instead, the model looks ahead again to predict whether the student will pass the Year 11 TAKS.
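To make the incentive concrete, here is a minimal sketch of how such a projection rule could work. The Texas Education Agency’s actual formula is not published in these reports, so the weights, function names and passing cut-off below are all hypothetical; only the three inputs (the student’s math and reading scores and the school’s average math score) and the one-way pass-or-project logic come from the newspaper account.

```python
# Illustrative sketch only. The TEA's real formula is not public in the
# reports cited here; the weights and the cut-off score are invented.

PASSING_SCALE_SCORE = 2100  # hypothetical TAKS passing cut-off


def project_year8_math(year7_math, year7_reading, school_avg_math,
                       w_math=0.6, w_reading=0.2, w_school=0.2):
    """Predict a Year 8 math score from the three inputs the article
    names. The weights are hypothetical stand-ins."""
    return (w_math * year7_math
            + w_reading * year7_reading
            + w_school * school_avg_math)


def counts_as_passing(year7_math, year7_reading, school_avg_math):
    """A student who actually failed Year 7 math is still recorded as
    passing if the projection clears the cut-off. Note the one-way
    logic: the projection is never later checked against the student's
    real Year 8 result."""
    actually_passed = year7_math >= PASSING_SCALE_SCORE
    projected = project_year8_math(year7_math, year7_reading, school_avg_math)
    return actually_passed or projected >= PASSING_SCALE_SCORE


# Example: a student who failed Year 7 math (2050 < 2100) is still
# reported as passing, because a strong reading score and a high school
# average lift the projected Year 8 score to 2140.
print(counts_as_passing(2050, 2350, 2200))  # True
```

The flaw the Dallas Morning News points to is visible in the last function: under this kind of rule the projection can only ever add passes to a school’s tally, never subtract them.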
The Dallas Morning News said it is a “get out of jail free” card for schools:
“So many schools are likely to benefit from this latest academic ‘get out of jail free’ card that it raises the question: At what point do the ratings become meaningless?”
State education officials said that the new school rating reports will make it clear what the pure TAKS passing rates are, compared to those boosted by the new projection model.
School performance rating reports were introduced in Texas in 1993 to better inform parents about school progress, but numerous changes since then have made it more and more difficult to gauge actual progress. It is a lesson in itself about how school reporting inevitably becomes corrupted as officials and schools find new ways to demonstrate that student achievement is improving. As the Dallas Morning News commented:
…over the years, the state has made so many changes that it is a test in itself to figure out if a school is doing better, doing worse or holding even… While the promise was for clarity, the current system is based on formulas that few understand.
Another way of turning fails into passes to improve school rankings is outright cheating. Cheating by helping students with answers or changing answers has become endemic in the US and England since school performance reports and tables were introduced. Texas, South Carolina, Mississippi, Virginia and California have all had problems in the last couple of years with cheating on tests used to measure school performance.
Yet another case occurred recently in Georgia, where it was revealed that student answers on state tests had been changed in four elementary schools (Education Week, 9 July).
As a result, one school principal has resigned and an assistant principal has been fired. Both have been charged with tampering with state documents, which is a felony.
Last week, the Georgia State Board of Education tossed out the results of the four schools, which means that the schools no longer meet federal standards and are not eligible for federal funding.
The West Australian, Texas and Georgia cases are the latest in a growing accumulation of examples of how schools rig their results under pressure to improve or maintain their ranking on school performance tables. They illustrate a well-known phenomenon in social science research, Campbell’s law, which states:
“The more any quantitative social indicator is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor”.
Far from improving transparency and school results, school performance reporting leads to manipulation of school results and to greater opacity and complexity. It misleads rather than informs. And it fails to achieve real improvement in student outcomes at school.
Trevor Cobbold