A Melbourne University study has debunked some common myths about the relative performance of Australia’s public and private schools. It shows that the decline in Australia’s performance in international tests over the last decade is due to falling results in private schools, with similar falls in both Independent and Catholic schools. It also shows that the decline has occurred across the whole range of student achievement and is not confined to high-achieving students, as some claim.
Australia’s average reading, mathematics and science results in the OECD’s Programme for International Student Assessment (PISA) were higher than the OECD averages for 2000 and 2009. However, reading and mathematics scores fell by 13 and 19 points respectively between 2000 and 2009, the latter decline amounting to about half a year’s learning.
The study by the Melbourne Institute of Applied Economic and Social Research investigated a range of student and school factors that may have contributed to the declines. Student characteristics included the educational and occupational background of their parents, gender, whether born overseas, language background, and the grade students were in when they undertook the PISA tests.
The study found that changes in student characteristics during this period would have been expected to lead to increased test scores rather than a decline. It therefore concluded that the results can “be treated as being consistent with a decline in the performance of Australian schools”.
The study analysed a range of school factors including the state and sector of schools, school size, their location in cities or regional areas, whether they are single-sex schools, and the average characteristics of their student body such as the average occupational SES of parents, and the distribution of students across grade levels. It found that the falls in school performance were concentrated among private schools and not associated with any other characteristics of schools. It states:
The main finding of note is that Catholic and Independent tended to be more efficient in converting students with given scores into high literacy scores than did public schools in 2003, but these differentials disappeared in the following two cohorts, so schools in these sectors were worse performers in the later cohorts.
At the school level, the declines in performance of schools have not been associated with many of their observed characteristics, other than that the declines appear to have been concentrated among private schools. Where private schools once generated better outcomes than public schools, given the compositions of their student bodies, this was not the case after 2003.
The study noted that the decline in performance in Independent and Catholic schools occurred despite substantial increases in government funding over the period. These funding increases greatly exceeded those for government schools, whose results appear not to have fallen. This suggests that private schools have used their funding increases much less effectively than government schools, raising the question of what benefit the nation’s taxpayers have received from this expenditure.
The findings of the study also contradict claims that the decline in PISA results is largely attributable to a reduction in the proportion of students performing at the highest proficiency levels. It has been argued that the number of high achievers is shrinking because resources are being diverted to weaker students.
However, results have declined for both high and low achieving students. For example, between 2003 and 2009 reading performance declined by 9.5 and 10.6 points for students at the 5th and 10th percentiles respectively, compared with 5.1 and 6.1 points at the 90th and 95th percentiles. Mathematics declined by 7.7 and 6.1 points at the 5th and 10th percentiles, compared with 10.8 points at both the 90th and 95th percentiles.
In general, the decline in reading between 2003 and 2009 appears to have been slightly larger in the bottom half of the achievement distribution than in the top half, while the reverse is apparent for mathematics, where the decline was larger in the top half of the distribution.
The declines were also similar across different socio-economic status (SES) backgrounds. For example, the average reading score for students in the lowest SES quintile (20%) declined by 13 points between 2003 and 2009 compared to a decline of 11 points for students in the highest SES quintile. The average mathematics score for students in the lowest quintile declined by 8 points compared to 12 points for students in the top quintile.
The study is to be published in a forthcoming issue of the academic journal Economics of Education Review.