The following is the text of a new Education Policy Brief by Trevor Cobbold published by Save Our Schools.
The Federal Minister for Education, Simon Birmingham, was quick to pounce on the PISA 2015 results, published in early December, to put another knife into the Gonski funding plan. He took the opportunity to repeat his highly misleading claim that school funding increases don’t improve school outcomes. His oft-repeated claim serves one purpose only – to justify his Government’s refusal to fully fund Gonski.
Birmingham dismissed funding as a factor in school outcomes, saying that Federal funding has increased by 50 per cent since 2003 while the PISA results have declined [Belot 2016; Munro & Bagshaw 2016]. However, he vastly exaggerated the actual increase in funding, which was very small and largely misdirected to the schools least in need of additional funding; he ignored some significant improvements in Year 12 outcomes that are in sharp contrast to the PISA results; and he ignored several recent academic studies showing that increased funding does improve school outcomes, especially for students from disadvantaged backgrounds.
The funding increase was only small and largely went to private schools
Birmingham’s claim of a 50 per cent increase in school funding since 2003 is far from the truth:
• It refers to total funding and not per student funding;
• It refers to nominal funding and not inflation adjusted funding;
• It ignores cuts in state government funding of public schools;
• It does not distinguish between funding increases for public and private schools; and
• It ignores changes in the composition of enrolments.
The actual increase in total government funding (Commonwealth and state/territory) per student, adjusted for inflation, for the nine years from 2004-05 to 2013-14 was only 4.5 per cent, some eleven times less than the Minister’s claim [Chart 1]. That amounts to only 0.5 per cent a year. The increase in dollar terms was a mere $472 per student for the whole period, or a minuscule $52 a year. Not surprisingly, this has had little impact on school outcomes.
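The arithmetic behind these figures is easily checked. A simple sketch, using only the numbers quoted in this brief (rounded to the precision given above):

```python
# Checking the funding-growth arithmetic quoted in the brief.
# All figures are taken from the text above; results are rounded
# to the precision the brief uses.

total_increase_pct = 4.5    # real per-student funding growth, 2004-05 to 2013-14
dollar_increase = 472       # per-student dollar increase over the whole period
claimed_increase_pct = 50.0 # the Minister's claimed increase
years = 9

# Average annual growth rate over the nine years
annual_growth = total_increase_pct / years
print(round(annual_growth, 1))   # 0.5 (per cent a year)

# Average dollar increase per student per year
print(round(dollar_increase / years))   # 52 (dollars a year)

# How many times the claimed figure exceeds the actual one
print(round(claimed_increase_pct / total_increase_pct))   # 11
```

The simple division slightly understates compound growth, but at rates this small the difference is negligible: 0.5 per cent compounded over nine years still comes to about 4.6 per cent.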
State/territory governments, which account for over 80 per cent of public school funding, have cut funding of public schools while increasing funding of private schools. State/territory governments have taken the opportunity of an increase in Commonwealth Government funding for public schools of $744 per student to cut their own funding of public schools by $348 per student [Chart 2]. In effect, they cut the Commonwealth increase by nearly half. In contrast, they increased private school funding by $135 per student to supplement the Commonwealth increase of $700 per student.
The picture is even worse because the larger part of the small increase in total funding per student went to private schools, which enrol only a small proportion of disadvantaged students. Total government funding per student in private schools increased by three times as much as for public schools – 9.8 per cent compared to only 3.3 per cent. In dollar terms, funding for private schools increased by $835 per student compared to $385 per public school student. That is, the most disadvantaged school sector got an increase of $43 per student per year compared to $93 per student per year for private schools.
The PISA results show that low achievement is concentrated amongst low SES, Indigenous and remote area students. The large majority of these disadvantaged students attend public schools. In 2014, 82 per cent of students from low SES families, 84 per cent of Indigenous students, 79 per cent of remote area students and 87 per cent of very remote area students were enrolled in public schools. Despite higher need in public schools, the biggest increases in funding went to private schools.
Birmingham’s claims also ignored changes in the composition of enrolments. Indigenous students, students with disability and senior secondary students attract significantly higher funding per student than average, and they have increased as a proportion of all students – by three percentage points, from 23 to 26 per cent, between 2003 and 2014. The increase in public schools was 3.4 percentage points compared to 2.2 percentage points in private schools. The growing share of these higher-cost students in public schools could well have absorbed the very small increase in funding per student in public schools.
Improvement in Year 12 outcomes contrasts with the PISA decline
While the declining PISA results are a major concern, Birmingham’s criticism of the lack of responsiveness of student outcomes to increased funding ignores some significant improvements in school outcomes over the period of the small funding increase. For example, there were significant improvements in Year 12 outcomes which are in sharp contrast with the declining PISA results.
There has been a large increase in the proportion of young adults (20-24 year-olds) attaining Year 12 or an equivalent qualification since 2001. In 2016, 90 per cent of young people attained Year 12 or Certificate II, up from 79 per cent in 2001, while 89 per cent attained Year 12 or Certificate III compared to 77 per cent in 2001 [ABS 2016b, Tables 31 & 32; ABS 2011].
It is also notable that the proportion of 25-34 year-olds in Australia who have attained an upper secondary education increased from only 68 per cent in 2000, when it was the 5th lowest in the OECD, to 88 per cent in 2015 [Chart 3]. The increase of 20 percentage points was the largest in the OECD except for Portugal and Turkey.
The apparent retention rate to Year 12 and the Year 12 completion rate are additional ways to measure the outcome of school education. The average retention rate from Year 7/8 to Year 12 increased from 67 per cent in 2000 to 84 per cent in 2015 [Chart 4]. The retention rate in public schools increased from 70 to 82 per cent and the rate for Catholic schools from 79 to 84 per cent. In Independent schools, it fell from 97 to 92 per cent. Indigenous retention rates increased from 36 to 59 per cent.
Year 12 completion rates have also increased. The rate for all students increased from 69 per cent in 2003 to 72 per cent in 2014. The completion rate for low SES students increased from 64 to 67 per cent, but fell for high SES students from 78 to 76 per cent [Productivity Commission 2005, Table 3A.40; Productivity Commission 2016a, Table 4A.124]. Despite this improvement, however, a large proportion of students still do not complete Year 12.
The percentage of the estimated Year 12 population achieving an ATAR score of 50 or above has increased significantly in recent years from 38 per cent in 2007 to 42 per cent in 2015 [Chart 5, earlier figures are not available]. The percentage increased in all states/territories except Queensland, with a large increase in the ACT and significant increases in NSW and South Australia.
The contrast between the declining PISA results for 15-year-old students (largely Year 10 students) and the improvement in Year 12 results is a puzzle that warrants further analysis. It may partly reflect a difference in student attitudes between the PISA tests, which have no personal consequences attached to them, and the Year 12 assessments, which have a major influence on the paths that students take after leaving school.
The one thing in common between the PISA results and Year 12 outcomes is huge achievement gaps between disadvantaged and advantaged students. Improving the results of disadvantaged students is the major challenge facing Australian education.
Many studies show that increased funding improves school results
Yet, Birmingham continues to wilfully ignore the extensive research evidence demonstrating that increasing funding for disadvantaged students is critical to improving outcomes. Five major studies published in the last year alone show that increased funding improves results, especially for disadvantaged students. For example, an extensive review of studies by an academic expert on education finance at Rutgers University in New Jersey shows strong evidence of a positive relationship between school funding and student achievement and that particular school resources that cost money have a positive influence on student results [Baker 2016]. It concludes:
The growing political consensus that money doesn’t matter stands in sharp contrast to the substantial body of empirical research that has accumulated over time… [p. 2]
The available evidence leaves little doubt: Sufficient financial resources are a necessary underlying condition for providing quality education. [p. 20]
A study published in the Quarterly Journal of Economics found that a ten per cent increase in per-student spending each year for all 12 years of public school for low income students extends their schooling by nearly half a year, increases their adult earnings by nearly ten per cent and family income by 16 per cent, and reduces their annual incidence of adult poverty by six percentage points [Jackson et al. 2016]. The study found that the positive effects are driven, at least in part, by some combination of reductions in class size, having more adults per student in schools, increases in instructional time, and increases in teacher salary that may have helped attract and retain a more highly qualified teaching workforce. The authors concluded that their results:
… highlight how improved access to school resources can profoundly shape the life outcomes of economically disadvantaged children, and thereby significantly reduce the intergenerational transmission of poverty. [p. 212]
A study published by the US National Bureau of Economic Research found that school finance reforms in the United States that increased funding for low income school districts improved the results of students in those districts [Lafortune et al. 2016]. It also found that the increased funding reduced achievement gaps between high and low income school districts. The authors concluded that “marginal increases in school resources in low-income, poorly resourced school districts are cost effective from a social perspective…” [p. 7]. Further, “Our results thus show that money can and does matter in education…” [p. 35]
Another study found that increased spending following court-ordered school finance reforms in the United States increased high school graduation rates in high-poverty districts [Candelaria & Shores 2016]. High poverty school districts in states that had their finance regimes overturned by court order experienced an increase in school spending of four to 12 per cent and an increase in high school graduation rates of five to eight percentage points in the seven years following reform.
In addition, a study soon to be published in the academic journal Economic Policy, on the long-run effects of school spending on educational attainment following school finance reform in Michigan, found that increases in school expenditure improve the later life outcomes of students [Hyman 2017]. Students who gained a ten per cent increase in school funding were seven per cent more likely to enrol in college and eleven per cent more likely to receive a post-secondary degree.
An OECD report on how to improve results for low performing students found that the incidence of low performance in mathematics is lower in countries where educational resources are distributed more equitably between socio-economically disadvantaged and advantaged schools. It concluded:
The evidence presented in this report suggests that all countries and economies can reduce their share of low-performing students, and that a reduction can be accomplished in a relatively short time. The first step for policy makers is to prioritise tackling low performance in their education policy agendas, and translate this priority into additional resources. [OECD 2016b, p.190]
The OECD has also highlighted a key message from PISA 2015:
In countries and economies where more resources are allocated to disadvantaged schools than advantaged schools, overall student performance in science is somewhat higher… [OECD 2016c, p. 189]
These studies show that targeting funding increases to disadvantaged schools and students is fundamental to improving student achievement and reducing achievement gaps between the advantaged and disadvantaged. Inadequate funding is likely to be a factor behind the failure to improve the results of disadvantaged students and reduce the large achievement gaps between them and high SES students. Past funding increases have been very small and were not directed primarily to disadvantaged students. Needs-based funding in Australia, especially for low SES students, has only ever been a very small proportion of total school funding, as demonstrated by a research report prepared for the Gonski review [Rorris et al. 2011]. As David Gonski said in response to criticism of his plan that increased funding has failed to improve outcomes:
…the essence of what we contended, and still do, was that the way monies are applied is the important driver. Increasing money where it counts is vital. The monies distributed over the 12-year period to which the commission refers were not applied on a needs based aspirational system. [Gonski 2014]
If there is any credibility to Birmingham’s criticism of past funding increases failing to improve results, it is in relation to private schools. As shown above, funding per private school student, adjusted for inflation, increased by nearly ten per cent between 2004-05 and 2013-14, but student performance fell in both Catholic and Independent schools. It suggests that private schools did not use their larger funding increases efficiently.
Federal and state education ministers are due to meet in coming months to decide future school funding arrangements. State education ministers should not be misled by Birmingham’s false claims about school funding and outcomes. All the evidence shows that increased funding for disadvantaged students is critical to improving school outcomes.
The national education ministers’ council should support the full implementation of the Gonski plan. It should resist the Federal Government’s proposal to cut education funding further by reducing funding indexation rates.
Australian Bureau of Statistics 2011, Education and Work, Australia – Additional Data Cubes, May 2011, Catalogue No. 6227.0.55.003, Canberra.
Australian Bureau of Statistics 2015, Schools Australia 2014, Catalogue No. 4221.0 Canberra.
Australian Bureau of Statistics 2016a, Schools Australia 2015, Catalogue No. 4221.0, Canberra.
Australian Bureau of Statistics 2016b, Education and Work, Australia, May 2016, Catalogue No. 6227.0, Canberra.
Baker, Bruce 2016, Does Money Matter in Education?, Albert Shanker Institute, Washington DC.
Belot, Henry 2016, Education Minister responds to damning report, says he’s open to importing specialist teachers, ABC News, 7 December.
Candelaria, Chris & Shores, Ken 2016, The Sensitivity of Causal Estimates from Court-Ordered Finance Reform on Spending and Graduation Rates, CEPA Working Paper No. 16-05, Center for Education Policy Analysis, Stanford University, Stanford, CA.
Cobbold, Trevor 2014, Money Matters in Education, Education Research Brief, Save Our Schools, July.
Cobbold, Trevor 2016, Productivity Commission Fails to Lift the Bonnet on its Own Funding Figures, Education Policy Brief, Save Our Schools, September.
Council of Australian Governments (COAG) 2009, Communique, 30 April, Hobart.
Gonski, David 2014, Jean Blackburn Oration, University of Melbourne, 21 May.
Hyman, Joshua 2017, Does money matter in the long run? The effects of school spending on educational attainment, Economic Policy (forthcoming).
Jackson, C. Kirabo; Johnson, Rucker C. & Persico, Claudia 2016, The effects of school spending on educational and economic outcomes: Evidence from school finance reforms, Quarterly Journal of Economics, 131 (1): 157-218.
Lafortune, Julien; Rothstein, Jesse & Schanzenbach, Diane Whitmore 2016, School Finance Reform and the Distribution of Student Achievement, National Bureau of Economic Research, NBER Working Paper No. 22011, February, Cambridge, Mass.
OECD 2016a, Education at a Glance 2016: OECD Indicators, OECD Publishing, Paris.
OECD 2016b, Low-Performing Students: Why They Fall Behind and How to Help Them Succeed, OECD Publishing, Paris.
OECD 2016c, PISA 2015 Results (Volume II): Policies and Practices for Successful Schools, PISA, OECD Publishing, Paris.
Productivity Commission 2005, Report on Government Services, Canberra.
Productivity Commission 2016a, Report on Government Services, Canberra.
Productivity Commission 2016b, Overcoming Indigenous Disadvantage: Key Indicators 2016, Canberra.
Rorris, Adam; Weldon, Paul; Beavis, Adrian; McKenzie, Phillip; Bramich, Meredith & Deery, Alana 2011, Assessment of Current Process for Targeting of Schools Funding to Disadvantaged Students, Australian Council for Educational Research, Camberwell.
Charts on School Funding and Outcomes
Charts 1–5 are available in the PDF version of this brief: https://saveourschools.com.au/wp-content/uploads/2018/06/Birmingham-is-Wrong-Again-About-School-Funding-and-Outcomes.pdf