Save Our Schools has called for the My School website to be scrapped in a 125-page, five-part submission to the Senate Education Committee inquiry into reporting school literacy and numeracy test results.
The submission carefully analyses each of the Government’s arguments for publishing school results in the light of available research evidence. It finds that none of the arguments withstand scrutiny.
Publication of school results and league tables does not improve student achievement
Part 1 of the submission analyses the Government’s claim that publishing school results will contribute to improving student achievement by increased pressure on schools from parents demanding better performance and through competition generated by comparing school results. A review of the major research studies shows that the claim is not substantiated. At best, the evidence is mixed. This is not a robust foundation for education policy.
Studies of the impact of publishing school results alone show no significant effect on student achievement.
The evidence from the major studies of “high stakes” accountability measures (which include reporting school results, rewards and sanctions for schools, exams for students and grade promotion standards) is mixed. Some find positive effects on student achievement; others find no effect or negative results. The most frequently cited studies showing a positive effect from “high stakes” accountability overstate the effect because of technical flaws in the measurement of the effect.
Much of the positive effect of school accountability measures shown in key studies is due to student-based requirements such as meeting grade promotion standards, passing end-of-course exams and graduation exams. The effect of these requirements on student achievement is larger than the effect of rewards and sanctions relating to school performance.
There is substantial evidence that the positive results in some studies may be due, at least in part, to schools manipulating their results in various ways such as re-classifying students as special education students so they can be exempted from high stakes tests, suspending low achieving students during the testing cycle and outright cheating.
The major academic reviews of “high stakes” accountability studies conclude that the evidence is mixed and provides little scientific foundation for these policies. This has been acknowledged by the chief executive of the Australian Curriculum, Assessment and Reporting Authority, Peter Hill, thus contradicting his Minister’s claims.
In addition, the weight of evidence from the best designed and most comprehensive of studies on increasing choice and competition between schools is that it does not improve student achievement once student background characteristics are taken into account. This is the conclusion of many reviews of school choice and competition studies.
Publishing school results and league tables harms education
Part 2 of the submission shows that publishing school results and league tables harm education. Overseas experience shows that it narrows the curriculum by reducing time for non-tested subjects such as science, history, social studies, arts and music, and physical education. It also narrows the curriculum because more emphasis is given to the areas that are most conducive to testing by multiple-choice questions and there is less teaching of more complex skills.
Teaching is distorted because schools and teachers tend to respond to the pressure created by publication of school results and league tables by focusing more on teaching test-taking skills and practising for tests. It also reduces collaboration between schools and between teachers within schools, and it makes it harder for low achieving schools to attract and retain high quality teachers.
Schools tend to concentrate on improving the results of students who are on the border of accepted benchmarks at the expense of both high and low achieving students. Publication of school results also unfairly stigmatises and humiliates some students and alienates them from schooling.
Publication of school results and league tables tends to increase socio-economic and ethnic segregation between schools, which exacerbates inequity in education. Student learning needs in some schools increase without proportionate increases in resources to meet those needs, while growing concentrations of students from low socio-economic status families in some schools tend to lead to lower overall outcomes.
The submission shows that many of these effects are already apparent in Australia.
Published school results are misleading and unreliable
Part 3 analyses the Government’s claim that publishing school results on the My School website will better inform parent choice of school. It shows that parents can be misled by using published school results to inform their choice of school because school results are not a reliable measure of school quality.
Published school results are an inaccurate and misleading measure of school quality because differences in school composition and many other factors outside the control of schools affect school results, including student absenteeism, student turnover and the proportion of students receiving private tutoring. In addition, they are a selective measure of education and they are subject to manipulation and rorting. There are also significant statistical errors in school test results, especially in the case of smaller schools.
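The statistical point about smaller schools can be illustrated with a short simulation (not from the submission; school sizes and score parameters are invented). Every simulated school draws its students from the same underlying score distribution, so any difference in published averages is pure sampling noise, and that noise is much larger for small schools:

```python
# Hypothetical sketch: why small schools' published averages are unreliable.
# All schools draw students from the SAME distribution, so differences in
# school averages are sampling noise only.
import random
import statistics

random.seed(42)

TRUE_MEAN, TRUE_SD = 500.0, 70.0  # assumed population score distribution

def school_average(n_students: int) -> float:
    """Average test score of one simulated school of n_students."""
    return statistics.mean(random.gauss(TRUE_MEAN, TRUE_SD)
                           for _ in range(n_students))

def spread_of_averages(n_students: int, n_schools: int = 2000) -> float:
    """Standard deviation of published averages across identical schools."""
    return statistics.stdev(school_average(n_students)
                            for _ in range(n_schools))

small = spread_of_averages(20)    # a school with 20 tested students
large = spread_of_averages(200)   # a school with 200 tested students

print(f"spread of averages, 20 students:  {small:.1f}")
print(f"spread of averages, 200 students: {large:.1f}")
```

In theory the spread shrinks with the square root of the number of students tested, so the 20-student schools' averages vary roughly three times as much as the 200-student schools', even though every simulated school is identical in quality.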
These factors also make it difficult to identify effective school practices. Decision-makers and schools may be misled in recommending and adopting particular educational programs. Ineffective practices and programs could be falsely identified as successes, while genuinely successful programs are ignored or even falsely condemned.
Like school comparisons are no answer
Part 4 of the submission analyses the Government’s claim that so-called “like school” comparisons provide contextual information that obviates the harm of simplistic comparisons of school results and league tables. It demonstrates that My School makes misleading and unreliable comparisons of so-called “like schools” because its measure of “like schools” is flawed and omits many factors outside the control of schools which affect test results. It does not consistently compare like with like.
In particular, the Index of Community Socio-Educational Advantage (ICSEA) which is used to measure the socio-economic status (SES) of schools is flawed. It attributes to each student the average SES of the area in which they live rather than the actual SES of their family. This leads to misclassification of students because high and low income families often live in the same areas.
As a result, the comparisons of “like schools” systematically and unfairly favour private schools over government schools. The average SES of private schools is artificially lowered by ICSEA while the average SES of government schools is artificially inflated because high income families choose private schools at double the rate of low income families. This leads to comparisons of unlike schools rather than like schools. There are also many other flaws in the measure of so-called “like schools”.
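The mechanism of this bias can be shown with a deliberately tiny, invented example: two families of very different actual SES share the same area, so an ICSEA-style index imputes both the same area average, and the schools they choose inherit the distortion:

```python
# Hypothetical illustration (all numbers invented) of the flaw described
# above: imputing the AREA-average SES to every student hides the real
# difference between families who share an area.
from statistics import mean

# Both families live in the same mixed area, so both are imputed the
# area average of 50, regardless of their actual family SES.
students = [
    {"actual": 80, "imputed": 50, "school": "private"},     # high-SES family
    {"actual": 20, "imputed": 50, "school": "government"},  # low-SES family
]

def school_ses(school: str) -> tuple[float, float]:
    """(mean actual SES, mean imputed SES) of a school's intake."""
    cohort = [s for s in students if s["school"] == school]
    return (mean(s["actual"] for s in cohort),
            mean(s["imputed"] for s in cohort))

private_actual, private_imputed = school_ses("private")
gov_actual, gov_imputed = school_ses("government")

print("private:    actual", private_actual, "imputed", private_imputed)
print("government: actual", gov_actual, "imputed", gov_imputed)
```

In this toy case the private school's measured SES (50) understates its actual intake (80) and the government school's measured SES (50) overstates its actual intake (20): the direction of bias the submission identifies, which compounds when high-SES families choose private schools at a higher rate.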
Other government claims for publishing school results are defective
Part 5 of the submission addresses other Government claims. It argues that reporting of school results is not necessary to identify struggling schools and those in need of intervention programs and additional resources as claimed. This information has long been available to education departments and schools. Governments have failed to provide the necessary resources and support for struggling schools.
A further argument used to justify publication of school results is that parents and the public have a right to this information. This argument is also flawed because it does not address the ‘public interest’ criteria for publishing government information.
There is no absolute right to information, as is recognised in the case of some court hearings, national security issues and Cabinet meeting minutes. While there should be a strong presumption in favour of releasing information about public institutions and others supported by government funding, these decisions should weigh the public benefit against the public harm. In the case of school results, the evidence is that publication brings no significant gains but has significant negative effects on education and school communities, while being subject to manipulation and rorting.
The right to information is a very important principle in a democracy. It is critical to keeping governments and government agencies accountable. However, there are some circumstances in which the provision of information can do greater harm than good. Reporting school results, and the inevitable league tables that follow, is one among many such circumstances.
There is an alternative
The imbalance between the harm and the benefit of publishing school results demands reconsideration by Australian governments. The SOS submission recommends that the My School site be abandoned because of the harm it will do to education and because of the fundamental difficulty of comparing schools in a meaningful, reliable and useful way.
There is an alternative way of providing greater accountability by education systems and governments for school results. This can be done without publishing the results of individual schools. It can be done by publishing the number of schools whose average score falls within different test score ranges. This information could be summarised as a histogram for each strand and Year level for both literacy and numeracy.
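A minimal sketch of that reporting alternative, with invented school averages and an assumed band width of 50 score points: only the count of schools in each score range is published, never an individual school's result.

```python
# Hypothetical sketch of system-level reporting: count schools whose average
# score falls in each band. School averages and band width are invented.
from collections import Counter

school_averages = [412, 438, 455, 471, 476, 490, 503, 512, 528, 547, 561, 590]
BIN_WIDTH = 50  # bands of 400-449, 450-499, ...

def band(score: float) -> str:
    """Label the score band a school average falls into."""
    lo = int(score // BIN_WIDTH) * BIN_WIDTH
    return f"{lo}-{lo + BIN_WIDTH - 1}"

histogram = Counter(band(s) for s in school_averages)
for bucket in sorted(histogram):
    print(f"{bucket}: {'#' * histogram[bucket]} ({histogram[bucket]} schools)")
```

One such histogram per strand and Year level, for literacy and numeracy, would show the system-wide distribution of results while keeping individual schools unidentifiable.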
The full SOS submission (no. 262) is available on the Senate Education Committee website.