Pyne Misleads on School Autonomy Results

Christopher Pyne has used highly selective and misleading evidence to support his claim that greater school autonomy for independent public schools will improve school outcomes. He ignores overwhelming national and international evidence indicating that his project to improve outcomes by making government schools more like private schools is doomed to fail.

Pyne claims that Independent Public Schools have been a “clear success” in Western Australia and that the OECD, the World Bank, the Productivity Commission and the Gonski report all supported greater school autonomy as a way to create improved outcomes for all students. These claims are highly misleading; most of them are false.

First, the only substantive research on the impact of IPS in Western Australia shows that they have not improved school results. The conclusions of an evaluation by the Melbourne Graduate School of Education [2013] provide compelling evidence that there has been no improvement in student outcomes:

In this early phase of the IPS development there is little evidence of changes to student outcomes such as enrolment or student achievement. [8]

…there was no evidence of substantial differences in outcomes between schools that were selected into IPS and those that were not. [9]

Analysis of the secondary data shows that IPS were generally high-performing before transition, and there has been no substantive increase in student achievement after becoming IPS. [36]

Similarly to student achievement data, analysis of available data on student enrolment and behaviour across all public schools showed no change for IPS. There were pre-existing differences in attendance rates between IPS and other public schools which remained unchanged over the three years of implementation. There were increases in enrolment for the first intake of IPS, but lesser increases for subsequent intakes. There were no differences in suspension, exclusion or retention rates between IPS and other public schools. IPS had lower numbers of students at moderate and severe attendance risk compared to other public schools, both prior to and after becoming IPS. [38-39]

As yet, there is no strong data indicating that the IPS initiative has significantly changed the ways that public schools engage with their communities.

…the secondary data shows no substantial change in staffing, student behaviour, attendance or performance between IPS and other public schools. [56]

…the IPS initiative has yet to realise changes in student achievement or attendance at school. [66]

At this stage of the IPS development there has been no evidence to indicate changes in enrolments or student achievement…

…the evaluation suggests that there has been limited change in outcomes for students, including achievement, enrolment, attendance, and exclusions and suspensions. [72]

…there were significant and pervasive concerns expressed by survey respondents and interviewees from all stakeholder categories that extending the IPS initiative to selected schools created a ‘two-tiered’ education system, to the disadvantage of schools that were not IPS. [67]

World-renowned education academic Professor John Hattie, who assisted in the WA evaluation, told the ABC’s FactCheck recently that before joining the initiative the schools involved had better academic results than other public schools, but made “no improvement under the ‘independent’ model”.

Second, OECD research has found that greater school autonomy in curriculum and assessment tends to improve outcomes, but greater school autonomy in budgeting and staffing has little to no effect. Increasing school autonomy in Australia is focussed on greater autonomy over budgeting and staffing decisions while control over curriculum and assessment is becoming more centralised. Independent Public Schools are granted greater control over staffing and budgeting decisions, but not over curriculum and assessment.

The results from the OECD’s 2009 Programme for International Student Assessment (PISA) show that “in countries where schools have greater autonomy over what is taught and how students are assessed, students tend to perform better” and “the prevalence of schools’ autonomy to define and elaborate their curricula and assessments relates positively to the performance of school systems” [OECD 2010: 14, 41].

However, the report continues:

While there is a clear relationship between the degree of curricular autonomy a school system offers its schools and the system’s performance, this relationship is less clear when the degree of autonomy in allocating resources is analysed through measures such as: selecting teachers for hire, dismissing teachers, establishing teachers’ starting salaries, determining teachers’ salary increases, formulating the school budget, and deciding on budget allocations within the school. [41]

The study found that in the vast majority of the 64 countries participating in PISA 2009, including Australia, there was no relationship between student achievement in schools and the degree of autonomy in hiring teachers and over the school budget. The OECD concluded emphatically that “…greater responsibility in managing resources appears to be unrelated to a school system’s overall student performance” and that “…school autonomy in resource allocation is not related to performance at the system level” [41, 86 (note 7)].

The one qualification is that “…where schools are held to account for their results through posting achievement data publicly, schools that enjoy greater autonomy in resource allocation tend to do better than those with less autonomy” [14]. However, the impact is trivial: students in higher-autonomy schools achieve only 2.6 points more in reading on the PISA scale than those in schools with an average degree of autonomy. To put this in perspective, learning over a school year amounts to an average gain of about 35-40 points on the PISA scale. This is hardly compelling evidence.

The national report on Australia’s PISA 2009 results also shows virtually no difference in the correlation between school autonomy and student achievement in NSW, which has the lowest degree of autonomy of any jurisdiction, and Victoria, which has a high degree of autonomy [Thomson 2010: 274]. Moreover, it found no significant relationship between student performance and school autonomy in budgeting and staffing in any school sector – government, Catholic or Independent – even though government schools overall have significantly less autonomy than Independent schools.

Third, Pyne’s citation of support by the World Bank is also highly misleading. A recent World Bank review of research studies says that there is no convincing evidence of the effects of school autonomy in Australia, New Zealand and the UK on student achievement [Bruns et al. 2011: 11]. The review focuses on studies of school autonomy in developing countries and notes that there are few rigorous studies available and that the evidence on impact on student test scores is mixed [12, 103, 106, 131].

Fourth, Pyne’s citation of support by the Productivity Commission ignores its conclusion that the evidence on school autonomy is mixed. The Commission’s report on the schools workforce noted that past studies “have found mixed impacts from delegating decision-making to schools” [Productivity Commission, 2012: 246].

Pyne’s final piece of evidence is that the Gonski report supported greater school autonomy. This is correct. However, the report did not undertake a comprehensive review of the evidence and cited only one study [Woessmann 2007], a cross-country analysis using 2003 international test data.

The reliability of such cross-country studies is questionable. One of the authors of the paper cited by the Gonski report has subsequently admitted, with other colleagues, that there are potential pitfalls associated with this type of study because it is extremely difficult to disentangle various national policy, institutional and cultural factors influencing education outcomes from the impact of school autonomy [Hanushek 2011: 3, 5]. The authors state that “imperfect measurement of specific institutions lead us to be cautious in the interpretation” of the results [24-25].

The findings of these types of studies are likely to be affected by a host of unmeasured country-specific factors which could influence the magnitude and even the direction of an observed relationship between achievement and school-based characteristics, such as the extent of school autonomy [Hamilton 2009: 10]. For these reasons, many researchers prefer to focus on longitudinal analysis of specific countries or regions to examine the effects of policy initiatives on school results.

Pyne has failed to sustain his case that more school autonomy will improve student results. His selective and misleading quotations ignore overwhelming national and international evidence that it has little to no impact. Other Australian reports also acknowledge the lack of evidence that school autonomy increases student outcomes [Jensen 2013; Victorian Competition and Efficiency Commission 2013]. The research evidence from New Zealand’s decentralised schools, US charter schools, Sweden’s free schools, and England’s academy schools shows no clear evidence that increased school autonomy leads to increased student achievement [Save Our Schools 2013].

Pyne’s claims about the success of independent public schools are contradicted even by his own colleagues. For example, a bipartisan report by the Senate Education Committee earlier this year stated that “…it is unclear whether school autonomy ultimately improves student outcomes” [Senate 2013: 47]. The NSW Minister for Education, Adrian Piccoli, says: “We will not be introducing charter schools or independent public schools because there is no evidence that they improve student performance” [Sydney Morning Herald, 20 July 2013].

All this shows that Pyne has got it drastically wrong on school autonomy. There is no compelling evidence that it will improve student results. Promoting independent public schools is completely ill-conceived and misplaced.

As Ken Boston has advised:

Stick to the knitting, Mr. Pyne: as shown in Britain, school governance is a peripheral distraction with no real bearing on student outcomes. [The Australian, 1 November 2013]

Trevor Cobbold

Bruns, Barbara; Deon Filmer & Harry A. Patrinos 2011. Making Schools Work: New Evidence on Accountability Reforms, The World Bank, Washington DC, February.

Hamilton, Laura. S. 2009. Using PISA Data to Measure School and System Characteristics and Their Relationships with Student Performance in Science, Paper presented to National Centre for Education Statistics research conference on the Programme for International Student Assessment: What We Can Learn From PISA, Washington DC, 2 June.

Hanushek, Eric; Susanne Link & Ludger Woessmann 2011. Does School Autonomy Make Sense Everywhere? Panel Estimates from PISA, Working Paper No. 17591, National Bureau of Economic Research, Cambridge, Mass., November.

Jensen, B.; Ben Weidmann & Joanna Farmer 2013, The Myth of Markets in School Education, Grattan Institute, Melbourne, July.

Melbourne Graduate School of Education 2013, Evaluation of the Independent Public Schools Initiative, Final Report, Perth, May.

OECD 2010. PISA 2009 Results: What Makes a School Successful? – Resources, Policies and Practices (Volume IV), Paris.

Productivity Commission 2012, Schools Workforce, Research Report, Canberra, April.

Save Our Schools 2013. School Autonomy Fails to Increase Student Achievement and Undermines Collaboration Between Schools, Submission to Senate Education Committee, January.

Senate Education, Employment and Workplace Relations References Committee 2013, Teaching and Learning – Maximising Our Investment in Australian Schools, Parliament House, Canberra, May.

Thomson, Sue; Lisa De Bortoli; Marina Nicholas; Kylie Hillman & Sarah Buckley 2010. Challenges for Australian Education: Results from PISA 2009, Australian Council for Educational Research, Melbourne.

Woessmann, Ludger; Elke Ludemann; Gabriela Schutz & Martin West 2007. School Accountability, Autonomy, Choice, and the Level of Student Achievement: International Evidence from PISA 2003, OECD Education Working Paper, No. 13, Paris, December.
