Australian education ministers have been forced to concede that security for the NAPLAN tests is inadequate now that they are “high stakes” tests. In effect, ministers have admitted that this year’s test results have been partially corrupted by cheating and rorting. It means that comparisons of results to be published on the My School website later in the year will be misleading in many cases.
A meeting of the national education ministers’ council last week requested that the Australian Curriculum, Assessment and Reporting Authority provide advice on ways to strengthen security and test administration protocols. The Federal Education Minister, Julia Gillard, said that security measures would be made tougher.
The decision to upgrade NAPLAN security follows several cheating incidents during the last round of tests held in May. They included opening test booklets early to better prepare students, teachers helping students with answers, and posters left on classroom walls to assist students during the tests.
In devising tougher security measures, ministers would do well to have regard to the overseas experience with cheating. In the twenty or more years since school test results were first published in England and the United States, no government has been able to stamp out cheating. Indeed, it appears to be increasing rather than diminishing.
Last week, the New York Times [10 June] reported that cheating by teachers and principals was being investigated in Georgia, Indiana, Massachusetts, Nevada, Virginia and other states. Earlier this year, one of the largest cheating scandals ever in the United States was revealed involving nearly 400 schools in the state of Georgia.
Last year a survey of public school teachers in Chicago by the Chicago Sun-Times and the teachers’ union revealed that one-third of all teachers had been pressured in the last year by principals and boards to change student grades. Twenty per cent said that they had actually raised grades under this pressure.
Experts say that cheating is increasing as the stakes over standardized testing ratchet higher, including, most recently, taking student progress on tests into consideration in teachers’ performance reviews. A data forensics expert told the New York Times [10 June] that cheating was on the rise in the US. “Every time you increase the stakes associated with any testing program, you get more cheating,” he said.
Others say that what is revealed is just the tip of the iceberg. For example, one expert, Tom Haladyna, professor emeritus at Arizona State University, told the Atlanta Journal-Constitution following other incidents of cheating in Georgia last year: “It’s just the tip of the iceberg, I think. The other 80 percent is being hidden” [21 June 2009].
Education departments across the US are now spending millions of dollars trying to monitor and deter cheating designed to bolster school results. Several states have hired test security companies to audit tests and check for cheating.
The most common technique is erasure analysis, in which answer sheets are scanned by computer to identify classes with higher-than-average numbers of changed answers. However, strategic cheating can often occur without triggering the statistical and other devices used to identify potential cheating.
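To illustrate how this kind of screening typically works, here is a minimal sketch of erasure-based flagging. The class names, erasure counts and the three-standard-error threshold are assumptions made for illustration only, not the methodology of any particular test security company:

```python
from statistics import mean, stdev

# Hypothetical input: erasure counts per student, grouped by class.
# In real audits these counts come from scanning answer sheets;
# the figures here are illustrative only.
erasures_by_class = {
    "Class A": [0, 1, 0, 2, 1, 0, 1],
    "Class B": [0, 0, 1, 1, 0, 2, 0],
    "Class C": [6, 5, 7, 4, 6, 8, 5],   # unusually heavy erasing
}

# Pool all students to estimate the overall mean and spread.
all_counts = [c for counts in erasures_by_class.values() for c in counts]
pop_mean, pop_sd = mean(all_counts), stdev(all_counts)

# Flag any class whose average erasure count sits far above the
# pooled mean (three standard errors is an assumed cut-off).
for name, counts in erasures_by_class.items():
    class_mean = mean(counts)
    standard_error = pop_sd / len(counts) ** 0.5
    z = (class_mean - pop_mean) / standard_error
    if z > 3:
        print(f"{name}: mean erasures {class_mean:.1f} (z = {z:.1f}) - review")
```

A screen of this kind only catches heavy-handed answer changing after the event; coaching students before the test or leaving answer posters on classroom walls leaves no erasure signature at all.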
Some states have adopted other approaches. For example, some school districts randomly rotate teachers to different classrooms during testing so that teachers do not supervise tests for their own students. However, this too is open to abuse and does not protect against other ways in which cheating can occur.
Despite the changes made to improve test security in the US, cheating incidents continue. Only last year, a report by the US Government Accountability Office identified many gaps in test security and found that many US states were still not able to ensure reliable and valid assessments.
Independent supervision of the tests is probably the only way to significantly improve the security of NAPLAN. The problem for ministers is that it would incur substantial additional costs and administrative difficulties.
There are over 9000 schools in Australia, most of which have at least two classes doing the NAPLAN tests. Thus, independent supervision could involve hiring more than 18 000 temporary staff for the week of the NAPLAN tests and cost upwards of an extra $4 million. Then there are the logistical problems of ensuring that it is only the independent supervisors who open the sealed packages of test booklets and seal up the packages of completed booklets for dispatch to NAPLAN markers.
This would be a massive undertaking and one which ministers are unlikely to take on. Yet, without this, cheating will grow and often remain undetected. The validity and reliability of the school results published on My School will be even more in question than they are now.
The only way cheating and rorting on NAPLAN will disappear completely is if My School is dumped. My School has such implications for school reputations and the careers of teachers and principals that it drives some to resort to cheating. No one was interested in cheating and rorting the national literacy and numeracy tests when there were no stakes attached to them. For as long as NAPLAN has high stakes attached to it, cheating and rorting will continue, and My School will continue to mislead the public and parents about school performance.