The My School website purports to enable comparisons of school results amongst socio-educationally similar schools – so-called ‘like schools’. However, these comparisons are unreliable because the measure of like schools is flawed and it fails to take account of many factors outside the control of schools which affect test results. It does not consistently compare like with like.
There are three basic flaws in the so-called like school comparisons on My School, which create considerable potential for schools to be wrongly and unfairly compared with significantly different schools.
First, the Index of Community Socio-Educational Advantage used to determine like schools is an inaccurate measure of the socio-economic status (SES) of schools. In particular, it is systematically biased in favour of private schools in comparison with ‘like’ government schools.
Second, differences in school composition according to gender, ethnic sub-groups and students with disabilities, all of which influence school results, are ignored.
Third, My School also ignores many other factors outside the control of schools which influence school results. These include differences in funding, student mobility between schools, school size, student selection and private tutoring of students.
ICSEA is an inaccurate measure of the socio-economic status of schools
ICSEA is an inaccurate measure of school SES for several reasons.
A major flaw is that My School’s measure of school SES is systematically biased in favour of private schools. The Index of Community Socio-Educational Advantage (ICSEA) makes the results of private schools look better than those of ‘like’ government schools by comparing higher SES private schools with lower SES government schools. This occurs because ICSEA systematically under-estimates the SES of private schools and over-estimates the SES of government schools.
There are two sources of this bias. First, ICSEA is based on the average socio-economic characteristics of the areas in which students live and not on the actual SES of their families. Studies by the Australian Bureau of Statistics show that some high income families live in low SES areas and vice versa, so the actual SES of some students will be above the area average and that of others below it.
Second, ICSEA fails to allow for differences in the proportion of high and low SES families that enrol in private and government schools. On average, 47% of high income families choose private schools compared to 24% of low income families. In the case of secondary schools, 55% of high income families choose private schools compared to 26% of low income families.
The greater leakage of high SES students from each area into private schools causes the ICSEA rating of private schools to under-estimate their actual SES because these students are classified according to their (lower) area SES measure rather than by their (higher) family SES. It also causes an over-estimate of the SES of government schools. Government schools take a greater proportion of low SES students, but these students are classified at the (higher) area SES rating rather than at the actual SES of their families.
This mismeasurement of school SES generates comparisons between higher SES private schools and lower SES government schools which favour private schools because higher SES students tend to have higher average results than lower SES students.
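The mechanism described in the preceding paragraphs can be sketched with a toy simulation. Everything here is invented for illustration except the enrolment rates, which are the 47% and 24% figures cited above: one area contains a mix of high and low SES families, high SES families are more likely to choose a private school, yet every student is rated at the single area average.

```python
import random

random.seed(42)

# Toy example: one area, 500 high-SES and 500 low-SES families.
# The SES scores (1000 and 600) are invented; the enrolment
# probabilities are the 47% / 24% figures cited in the text.
HIGH_SES, LOW_SES = 1000, 600
families = [HIGH_SES] * 500 + [LOW_SES] * 500
area_average = sum(families) / len(families)   # every student gets this rating

private, government = [], []
for ses in families:
    p_private = 0.47 if ses == HIGH_SES else 0.24
    (private if random.random() < p_private else government).append(ses)

def mean(values):
    return sum(values) / len(values)

print(f"area-based rating assigned to both sectors: {area_average:.0f}")
print(f"actual mean family SES, private schools:    {mean(private):.0f}")
print(f"actual mean family SES, government schools: {mean(government):.0f}")
```

Because high SES families leave the area pool at a higher rate, the true SES of the private intake sits above the area rating while that of the government intake sits below it, which is precisely the under- and over-estimation described above.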
This systematic bias in favour of private schools could be compounded if the relatively high non-response rates to Census questions on key ICSEA variables are concentrated among low SES families. In the 2006 Census, 13% of family households did not state, or only partially stated, their income while 18% of people in the 25-54 age groups did not state their non-school qualifications.
ICSEA may also provide inaccurate measures of school SES because the version used to determine like school comparisons does not distinguish between families with and without school-age children. This matters in areas where there are significant differences in the SES of families with and without school-age children. For example, a school may be rated low SES because of high concentrations of pensioners and young unemployed people in its area, even though the families with school-age children who attend it are well-off. Such a school would be incorrectly classified as low SES and wrongly compared with actual low SES schools when it should be compared with higher SES schools.
It is also a concern that ICSEA is already out of date because it is based on 2006 Census data and it will become even more out of date in the next few years. Census surveys are only carried out every five years and it takes up to two years for the SES data to become available after each Census. This means ICSEA will be seven years out of date by the time the data from the next Census becomes available. Significant social change can occur over such a period. For example, unemployment has increased significantly since the last Census due to the global financial crisis and this may affect the ICSEA ratings of schools.
Other differences in student composition are ignored
The My School comparisons of the test results of ‘like schools’ ignore other differences in the student composition of schools which have a significant effect on school results. While SES accounts for a large part of the influence of background factors on school performance, there are also other background factors which, if not taken into account, could invalidate like-school comparisons. These include differences in school composition by gender, ethnic sub-groups and students with disabilities.
Girls consistently achieve higher literacy results than boys. This is likely to be very significant in comparing outcomes of all-boy and all-girl schools with similar SES profiles. On average, all-boy schools will tend to have lower average literacy achievement than all-girl schools with a similar SES composition.
It is also likely that, on average, all-girl schools will achieve higher results than co-educational schools with a similar SES profile, because the male half of a mixed enrolment is likely to lower the school’s average literacy achievement compared to an all-girl school.
My School ignores differences in the ethnic composition of schools. For example, average test results of Chinese students are well above those of Middle Eastern and Pacific Islander students. Schools with a similar ICSEA rating could have quite different average results simply because some have a high proportion of students of Chinese origin while others have a high proportion of Middle Eastern or Pacific Islander students.
Differences in the proportion of students with disabilities between schools are also ignored. Schools with higher proportions of students with disabilities participating in tests may have lower average results than other schools with a similar ICSEA value. This omission further disadvantages government schools because they have higher proportions of students with disabilities than private schools.
Many other factors affecting school results are ignored
ICSEA omits a range of other factors that strongly influence school results. These include differences in funding, school size, student mobility between schools, student selection and private tutoring.
Schools with similar ICSEA values may have vastly different levels of funding, which may contribute to differences in school results. For example, in 2007-08 recurrent expenditure per government secondary school student in Western Australia was 43% higher than in Victoria, while expenditure per primary school student in the ACT was 48% higher than in Victoria. In addition, total funding per student in many high SES private schools is double or more that of high SES government schools.
My School does not take account of differences in school size which can affect the comparison of school results. Very large schools tend to have lower average results than small to medium sized schools. Small schools are much more likely to report large changes in average results from one year to the next because their results can be heavily influenced by the results of only 4 or 5 students.
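The volatility point can be illustrated with a quick simulation (all figures hypothetical, loosely scaled like a standardised test): draw every student’s score from the same distribution and compare how much a 5-student cohort’s average jumps from year to year against a 100-student cohort’s.

```python
import random
import statistics

random.seed(0)

# Hypothetical scores: mean 500, standard deviation 70 for every student.
# Only the cohort size differs between the two schools.
def yearly_averages(cohort_size, years=20):
    """School-average score for each of `years` simulated cohorts."""
    return [statistics.mean(random.gauss(500, 70) for _ in range(cohort_size))
            for _ in range(years)]

small_school = yearly_averages(5)
large_school = yearly_averages(100)

print(f"year-to-year spread of the average, 5 students:   "
      f"{statistics.stdev(small_school):.1f}")
print(f"year-to-year spread of the average, 100 students: "
      f"{statistics.stdev(large_school):.1f}")
```

The 5-student average swings several times more than the 100-student average (the standard error of a mean shrinks with the square root of cohort size), so a large year-to-year change at a small school says little about any change in teaching quality.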
Some schools with a similar ICSEA rating may have lower results because they have a high proportion of students who often change schools. Students who move school frequently tend to have lower average results than other students.
The ‘like school’ comparisons on My School ignore student selection by many private schools and some government schools. Some schools may achieve higher results than others with a similar ICSEA rating because they can select higher achieving students and exclude lower achieving students.
Moreover, many families today engage private tutors for their children. Differences between schools in the proportion of families who resort to private tutoring may be reflected in differences in school results between so-called ‘like schools’.
In addition to all these flaws, comparison of test results of ‘like schools’ can also be distorted by schools manipulating and rorting their test results to artificially boost their rankings on school league tables.
ICSEA misleads about differences in school quality
Because of its flaws and omissions, ICSEA exaggerates the differences in quality between ‘like schools’ and thereby misleads those who choose schools or make policy decisions on the basis of these comparisons.
The socio-economic variables incorporated in ICSEA explain about 70% of the variation in aggregated primary school results, leaving about 30% implicitly attributable to differences in teaching, curriculum, pastoral care and other aspects of schools. If school SES were more accurately measured, and if factors such as gender, ethnic sub-groups, students with disabilities, school funding, school size, student mobility and student selection practices were included in ICSEA, its explanatory power could increase to 85-90% of the variation in school results. This would leave only 10-15% of the variation in school results as explained by differences in school quality.
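The logic of this passage can be sketched with synthetic data: when results depend on SES plus an omitted factor (gender, in this toy example), a regression on SES alone leaves a large residual that is easily misread as ‘school quality’, and adding the omitted factor shrinks it. Every number below is invented for illustration and none comes from the actual My School data.

```python
import numpy as np

# Illustrative only: synthetic scores driven by SES, gender, and
# everything else (including genuine school quality) as noise.
rng = np.random.default_rng(1)
n = 1000
ses = rng.normal(0.0, 1.0, n)        # family socio-economic status
gender = rng.integers(0, 2, n)       # 1 = female, 0 = male
noise = rng.normal(0.0, 0.52, n)     # school quality + all other factors
score = ses + 0.8 * gender + noise

def r_squared(predictors, outcome):
    """Share of variance in `outcome` explained by an OLS fit on `predictors`."""
    X = np.column_stack([np.ones(len(outcome)), predictors])
    beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
    residuals = outcome - X @ beta
    return 1.0 - residuals.var() / outcome.var()

r2_ses = r_squared(ses, score)
r2_full = r_squared(np.column_stack([ses, gender]), score)

print(f"variance explained by SES alone:        {r2_ses:.0%}")
print(f"variance explained by SES plus gender:  {r2_full:.0%}")
print(f"residual misread as 'school quality':   {1 - r2_ses:.0%} -> {1 - r2_full:.0%}")
```

Adding the omitted variable raises the explained share and shrinks the residual, which is exactly the sense in which a richer index would leave less of the variation to be attributed to school quality.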
My School is a travesty of ‘like school’ comparisons. Its measure of the SES of schools is entirely inadequate. It needs a major overhaul. The systematic bias in favour of private schools generated by the use of an area-based SES measure can only be resolved by developing a consistent measure of individual family SES. ICSEA should also be revised to take account of differences between schools in gender, ethnic sub-groups, students with disabilities, school funding, school size, student mobility and student selection practices.
A better way of determining like schools should be investigated by an independent public review.