Study Shows Schools Rorting Test Results

My School has turned the NAPLAN tests into “high stakes” tests. They are high stakes because school reputations are now made and broken on the results posted on the My School website and in league tables of school results published in newspapers. The careers of school principals and teachers are now also affected by school NAPLAN results.

Money now also hinges on school results. The Federal Government makes reward payments to state governments on the basis of their NAPLAN results and plans to make reward payments to schools and teachers for improving test results. The Victorian and Queensland Governments intend to introduce performance pay for teachers based partly on NAPLAN results.

When tests involve “high stakes” they are prone to rorting and manipulation. Schools have strong incentives to artificially manipulate the results to their benefit. Overseas experience demonstrates this is a practical reality in high stakes regimes.

Over the past twenty years in the United States, state and federal education policies have tried to hold schools more accountable by tying rewards and sanctions to test scores. Several academic studies have shown that these policies may induce schools to “game the system” instead of making genuine educational improvements.

A new study to be published by the Federal Reserve Bank of New York of a high-profile school accountability program in Florida has found evidence of rorting by schools. It shows that schools were classifying low-performing students into exempt categories that are excluded from the school performance grading, so as to artificially raise their test scores.

The study analysed the responses of public schools to the Florida Opportunity Scholarship Program, an influential school accountability policy. Under the policy, schools are rated on their performance in standardised tests. Failure exposes a school to a voucher system allowing its students to leave and attend either a better-performing public school, or a private school.

Using data from the Florida Department of Education, the study compared the classifications of students in schools that barely avoided an F (fail) grade with classifications in schools that narrowly received an F grade.

It found that the accountability program led to increased classification of students into the exempt Limited English Proficiency (LEP) category in grade 4, the high-stakes year, and in grade 3, the entry grade for that year. The threatened schools increased their classification of exempt LEP students by an average of 53 per cent in grade 4 and 55 per cent in grade 3.

We find robust evidence that the threatened schools classified a greater percentage of their students into the excluded LEP category in high-stakes grade 4 and entry grade 3. [p. 22]

It found no evidence of increased classification of students into the exempt special education category for students with disabilities. It also found no evidence of differential shifts in the classification of black, Hispanic, Asian, American Indian, multiracial, free or reduced-price lunch students after the introduction of the program.

The study also analysed the implications of its findings for New York City’s Progress Reports program, which was modelled on the Florida program but contains crucial design differences. Two critical differences are that the New York policy does not include a voucher to enrol at another public or private school, and that it includes the test scores of all English language learners and special education students in the computation of school grades. There is therefore less incentive to re-classify students into exempt categories.

The authors noted that the New York City program rules can generate other adverse incentives for re-classification. Schools there that are graded as failing can earn additional credit for demonstrating the progress of English language learners and special education students, and this could be an incentive to classify higher-performing students in these categories to artificially boost scores. This is an issue for further research.

The authors said that their findings offer lessons for the design of school accountability programs elsewhere.

The general lesson to take from examining the Florida and New York accountability policies is that policymakers must be careful when designing exemptions, special allowances, or credits for certain groups of students since these accommodations can create adverse incentives and unintended consequences. While accountability policies must acknowledge the challenges schools face in educating students with limited English proficiency, disabilities, and other special needs, excluding them entirely from accountability measures may induce struggling schools to reclassify low-performing students into exempted categories. The danger is that such an approach can lead to strategic sorting rather than genuine improvements to the quality of education for the students whom the programs aimed to help. [p.22]

In Australia, exemptions from the NAPLAN tests may be granted to students with intellectual disabilities and to students who have recently arrived in Australia from a non-English speaking background. Exempt students are not included in the calculation of mean scores for schools, so an incentive exists for schools to classify low-performing students as exempt in order to lift the school scores.

There is little evidence of such practices in Australia so far, but the incentives will increase as reputations, careers and salaries come to depend more and more on NAPLAN results. A Wall Street Journal article on the new study warned that “…anywhere where testing and statistics become the guiding forces in how something is judged, cheating and misrepresentation can follow.” It is a lesson that governments in Australia should heed before they increase the stakes associated with NAPLAN results.

Trevor Cobbold

Rajashri Chakrabarti & Noah Schwartz, Unintended Consequences of School Accountability Policies: Evidence from Florida and Implications for New York, FRBNY Economic Policy Review (forthcoming).

