An article in the Sydney Morning Herald this week by columnist Paul Sheehan demonstrates why school league table rankings are highly inaccurate and misleading. Sheehan compared the rankings of government and private schools in the eastern suburbs of Sydney and concluded that most private schools produce clearly better results. He claims that private schools provide a better education than public schools even when they have similar resources and similar socio-economic catchment areas.
However, he failed to take account of significant differences in the socio-economic status (SES) composition of schools and the statistical margin of error on the NAPLAN test results used to rank schools. He even compares some public and private schools with large differences in resources.
Sheehan first compared the rankings of Marcellin College, a Catholic boys school, with Randwick Boys High and found that Marcellin ranked 336 places above Randwick Boys on the Herald’s league table. Randwick has a significantly lower socio-economic index (ICSEA) rating than Marcellin – 1054 compared with 1084.
The socio-economic composition of the schools is very different. Randwick Boys has over double the proportion of students from the lowest SES quartile as Marcellin – 25% compared to 10%. In addition, it has a much lower proportion of high SES students – 25% compared to 36%. Despite its much higher proportion of students from the lowest SES quartile, Randwick Boys has roughly the same gross income per student as Marcellin – $12,797 compared to $12,543.
Sheehan also compares the results of Randwick Boys with those of Waverley College, which is ranked nearly 300 places above it. This is an even more misleading comparison. Waverley has an ICSEA rating of 1144, 90 points above Randwick. Randwick Boys has five times the proportion of students in the lowest SES quartile as Waverley College – 25% compared to 5%. It has 42% of its students in the lowest two quartiles compared to only 16% at Waverley. Waverley has 46% of its students in the top SES quartile compared to 25% at Randwick Boys. In addition, Waverley has a gross income of $18,363 per student compared to $12,797 at Randwick Boys, that is, 43% more per student.
There can be little wonder then at the difference in results when there are such stark differences in the socio-economic composition of these schools, especially when Randwick Boys receives only the same funding or less funding than the two private schools.
Sheehan then goes on to compare Randwick Girls High with St. Catherine’s, an Anglican girls school, and finds that St. Catherine’s ranks at 52 on the Herald’s league table compared to 231 for Randwick Girls. He wrongly asserts that the two schools have a similar socio-economic composition.
Sheehan shows complete ignorance of the ICSEA scale. He says that St. Catherine’s ICSEA rating of 1183 is only 10% above that of Randwick Girls at 1071. He fails to understand that the median score is set at 1000. ICSEA values range from 500 to 1300, but the vast majority of schools fall in the range 750–1200. Measured against that effective range, the 112-point gap between the two schools is a large one, not the trivial difference Sheehan’s percentage suggests.
There is a huge difference in the socio-economic composition of St. Catherine’s and Randwick Girls. Sixty-five per cent of St. Catherine’s students are from the highest SES quartile and only 5% are from the lowest quartile compared to 34% and 17% respectively at Randwick Girls. St. Catherine’s also has double the funding per student of Randwick Girls.
Once again, there can be little wonder at the difference in the results of the two schools given the difference in their socio-economic composition.
Sheehan also compares the rankings of Randwick Girls High with Brigidine College, a Catholic girls school. The schools have a similar ICSEA rating (1071 and 1082), a similar proportion of students from the lowest SES quartile (34% and 30%) and similar levels of funding ($11,485 and $11,980 per student). Sheehan says that Brigidine is the much better school because it ranks more than 100 places above Randwick Girls on the Herald’s league table.
However, Sheehan makes an elementary mistake in not looking at the statistical margins of error on the results of the two schools. The error margins reported by My School display the extent of uncertainty associated with school results.
All tests incur random errors or chance variation in student results which cause differences in school results that do not reflect differences in actual performance. For example, the same students may achieve different results on the same test on different days because of differences in their own well-being, such as lack of sleep or food, or because of variations in external factors such as how cold or hot conditions are in the room in which the tests are conducted.
Take the Year 9 results in the schools compared by Sheehan. The average scores for Brigidine are slightly above those of Randwick Girls. However, the error margins for the two schools overlap in all tests except writing. This means that the results of the schools are statistically similar and cannot be distinguished.
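The overlap test applied here can be illustrated with a short calculation. The scores and margins below are invented for illustration only; they are not the actual My School figures for these schools:

```python
# Illustrative check of whether two school results are statistically
# distinguishable, given average scores and reported margins of error.
# If the two confidence intervals overlap, the results cannot be
# reliably told apart. All numbers below are hypothetical.

def intervals_overlap(score_a, margin_a, score_b, margin_b):
    """Return True if the intervals [score +/- margin] overlap."""
    low_a, high_a = score_a - margin_a, score_a + margin_a
    low_b, high_b = score_b - margin_b, score_b + margin_b
    return low_a <= high_b and low_b <= high_a

# Hypothetical Year 9 scores with +/- margins of error
print(intervals_overlap(590, 15, 578, 14))  # True: overlap, indistinguishable
print(intervals_overlap(605, 10, 578, 14))  # False: no overlap, a real gap
```

A league table that ranks the first pair of schools hundreds of places apart is manufacturing a difference the data cannot support.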
Sheehan draws the same conclusion from a comparison of St. Clare’s College, another Catholic girls school, and Randwick Girls. Once again, the story is the same. The error margins overlap in all tests except writing and, therefore, there is no demonstrable difference in results. It is also to be noted that the ICSEA rating of St. Clare’s at 1103 is significantly higher than that of Randwick Girls and St. Clare’s has about one-third more funding per student than Randwick Girls. So, Randwick Girls is doing very well in comparison.
The failure to take account of the margins of error on school results produces quite misleading and inaccurate league table rankings. Statistically similar results between Brigidine, St. Clare’s and Randwick Girls have been converted into huge artificial differences on the Herald’s league table rankings.
It should be noted that the Herald’s league table is a composite of Year 7 and 9 results, so the Year 7 results should also be considered. The Year 7 results of Randwick Girls and St. Clare’s are statistically similar but those of Brigidine are above the error margins of Randwick Girls, indicating actual higher results.
However, using Year 7 results to compare the quality of secondary schools is highly dubious as this is the first year of secondary school and the tests are conducted just three months after the beginning of the school year. Different results in Year 7 are more likely to reflect differences in intake.
The failure to take proper account of differences in socio-economic composition and error margins on school results demonstrates the fallibility of school league tables. They tell us little about school quality and the comparisons made by Sheehan do not justify his conclusion that private schools produce better results than public schools.
A far more reliable comparison of private and public school results is provided by the OECD’s Programme for International Student Assessments (PISA). The 2009 PISA results for Australia show that private schools do no better than public schools after accounting for differences in socio-economic composition. The report states:
Once differences in students’ socioeconomic background were taken into account there were no longer any statistically significant differences in the average reading, mathematical and scientific literacy scores of students from the different school sectors. [p.ix]
Clearly then, Sheehan has got it very badly wrong. League tables are not the way to compare school quality.