The comparisons of the test results of so-called like schools on My School are biased in favour of private schools. This occurs because My School uses a flawed measure of the socio-economic status (SES) of schools to group so-called “like schools”, a measure that systematically over-estimates the level of socio-economic disadvantage in private schools and under-estimates disadvantage in government schools.
Consequently, My School compares the test results of higher SES private schools with those of lower SES government schools, thereby making private school results look better than those of their “like” government schools.
This systematic bias in favour of private schools is likely to be further compounded by missing Census data. The Index of Community Socio-Educational Advantage (ICSEA) used to measure school SES is based on several variables drawn from Census data. If the relatively high non-response rates to key Census questions are concentrated among low SES families, ICSEA is likely to under-estimate the level of disadvantage in government schools, causing them to be compared with higher SES private schools.
Not all people return Census forms or fully answer Census questions, and the response rates vary for different questions. The non-response rates for the factors used to construct ICSEA vary considerably, but are significant for key questions.
In particular, 13% of family households did not state, or only partially stated, their income, while 18% of people in the 25-54 age group did not state their non-school qualification and 8% of this age group did not state their highest level of school education. In comparison, only 2% of people in the 25-54 age group did not state their occupation and 5% did not state their language and proficiency in English.
There does not appear to be any direct evidence on the characteristics of those who did not answer, or only partially answered, these questions. However, there is some evidence to suggest that non-responses to surveys requesting information on income and education qualifications tend to be concentrated amongst low SES families.
For example, there is a high non-response rate to questions on family income and occupation on school enrolment forms. According to the 2009 report on the National Assessment Program – Literacy and Numeracy, 17 to 25% of families of students at different Year levels do not provide parent education and occupation information. The average literacy and numeracy results for students of these families (grouped as “non-stated”) are generally low and similar to those of students whose parents only completed Year 11 or Year 12 and work in low skilled occupations.
If those who did not state their income, non-school qualification and highest level of school education in the Census are disproportionately from low SES households, the proportion of such families nationally and in different regions will be under-estimated. This means that a significant proportion of low SES families is excluded from the data used to measure school SES and group “like schools”.
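A purely hypothetical calculation illustrates the mechanism. The household counts and non-response rates below are assumptions chosen for illustration only; they are not Census figures.

```python
# Illustrative only: invented figures showing how non-response that is
# concentrated among low SES households biases the estimated share of
# low SES families downwards.

households = 1000        # hypothetical area
true_low_ses = 400       # 40% of households are actually low SES

# Assumed non-response rates: low SES households are three times as
# likely not to state their income.
nonresponse_low, nonresponse_other = 0.24, 0.08

responding_low = true_low_ses * (1 - nonresponse_low)                      # 304
responding_other = (households - true_low_ses) * (1 - nonresponse_other)   # 552

estimated_share = responding_low / (responding_low + responding_other)
print(f"True low SES share:             {true_low_ses / households:.0%}")  # 40%
print(f"Share estimated from responses: {estimated_share:.0%}")            # ~36%
```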
This could have significant implications for the comparison of school results on My School.
First, it is likely that the overall strength of the relationship between low SES and student outcomes is under-estimated, as is the variance in aggregated school outcomes explained by the variables used to construct ICSEA. That is, ICSEA under-estimates the variation in student outcomes explained by socio-economic factors, thereby over-estimating the variation in student outcomes explained by in-school factors. This would mean that ICSEA exaggerates the differences in student results between so-called “like schools”.
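This attenuation argument can be sketched with a small simulation. The sketch below simply treats the effect of missing Census responses as noise in the measured SES index; every parameter is an assumption for illustration, not an estimate of the actual ICSEA relationship.

```python
# A minimal, assumption-laden sketch of attenuation: when school SES is
# measured with error (here a stand-in for the effect of missing Census
# responses), the variance in school results attributed to SES falls,
# and residual "school effect" differences look larger than they are.

import numpy as np

rng = np.random.default_rng(0)
n_schools = 2000

true_ses = rng.normal(0, 1, n_schools)                      # true school SES index
results = 0.8 * true_ses + rng.normal(0, 0.6, n_schools)    # outcomes partly driven by SES

# Measured SES index: pulled towards the middle, plus noise.
measured_ses = 0.7 * true_ses + rng.normal(0, 0.5, n_schools)

def r_squared(x, y):
    return np.corrcoef(x, y)[0, 1] ** 2

print(f"Variance explained using true SES:     {r_squared(true_ses, results):.2f}")
print(f"Variance explained using measured SES: {r_squared(measured_ses, results):.2f}")
```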
Second, it is also likely that the number of families with low income, education and qualifications in low SES areas is under-estimated, so that the measured average SES of these areas is over-estimated and the extent of their disadvantage is under-estimated. That measured average is the weighted average across all the families in the area who responded to the Census questions, including higher SES families.
In this event, the ICSEA ratings of government and private schools serving these low SES communities would be over-estimated; that is, the level of disadvantage in these schools would be under-estimated. Significantly different response rates between low SES areas could also result in some schools being given a higher rating than warranted and being wrongly compared with schools of genuinely higher SES.
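Again, the direction of this bias can be shown with a hypothetical weighted average for a single school community. The SES scores, household counts and non-response rates below are invented for illustration; the point is only that the measured average rises when low SES households drop out.

```python
# Invented figures: an area's average SES computed only from households
# that answered the Census questions, when low SES households answer less often.

low_score, high_score = 900, 1100            # hypothetical SES index values
low_households, high_households = 600, 400   # true composition of the area

true_average = (low_score * low_households + high_score * high_households) \
               / (low_households + high_households)

# Assumed non-response: 20% of low SES and 5% of high SES households
# drop out of the weighted average.
resp_low = low_households * 0.80     # 480
resp_high = high_households * 0.95   # 380

measured_average = (low_score * resp_low + high_score * resp_high) \
                   / (resp_low + resp_high)

print(f"True average SES score of the area: {true_average:.0f}")       # 980
print(f"Average measured from respondents:  {measured_average:.0f}")   # ~988
```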
The Australian Curriculum, Assessment and Reporting Authority does not appear to have investigated the potential implications of missing Census data for the ICSEA ratings of schools and comparisons of like schools. It does not appear to have done any sensitivity analysis on the ratings to establish their robustness in the presence of the missing data.
The possibility of additional bias in the comparison of government and private schools because of missing Census data is a further reason for a full, independent public inquiry into the validity of so-called “like school” comparisons on My School.
Trevor Cobbold