The current system of comparing the literacy and numeracy test results of so-called “like schools” on the My School website is systematically biased against government schools because the results of higher socio-economic status (SES) private schools are compared with lower SES government schools.
This bias occurs because the measure of school SES is based on the average of the geographical areas in which students live rather than on their family SES. Some high SES families live in low SES areas, but their children are more likely to attend private schools than government schools. These students carry the low SES rating of their home area with them, which causes the SES of the private schools they attend to be under-estimated.
It also causes the SES of government schools to be over-estimated, because the area average includes high SES families whose children do not attend government schools. So the SES of private schools is systematically under-estimated and that of government schools is systematically over-estimated.
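This mis-measurement is a matter of simple arithmetic. The sketch below uses invented numbers (the SES scores and family counts are illustrative only, not drawn from My School data) to show how an area-averaged measure understates private school SES and overstates government school SES when enrolment patterns split along SES lines within an area:

```python
# Illustrative sketch (invented numbers): how an area-averaged SES
# measure can misstate school SES when enrolment differs within an area.
# Suppose an area contains 10 families: 3 high-SES (score 80) and
# 7 low-SES (score 40). Every family is assigned the area average.
area_ses = (3 * 80 + 7 * 40) / 10  # 52.0 for every family in the area

# Suppose the 3 high-SES families choose a private school and the
# 7 low-SES families attend the local government school.
private_true = 80       # true mean family SES of private enrolments
government_true = 40    # true mean family SES of government enrolments

# Under the area-based measure, every student carries the score 52:
private_measured = area_ses      # under-estimates the true 80
government_measured = area_ses   # over-estimates the true 40

print(private_measured - private_true)       # -28.0 (under-estimate)
print(government_measured - government_true)  # 12.0 (over-estimate)
```

The direction of the two errors is the point: any area-averaged measure pushes the two sectors' scores toward each other whenever higher SES families within an area disproportionately choose private schools.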
My School thus compares the results of dissimilar schools. As higher student achievement is strongly associated with higher SES, comparing the results of higher SES private schools with lower SES government schools makes private schools look better.
In light of this criticism, the Australian Curriculum, Assessment and Reporting Authority (ACARA) is investigating using student level data on parent education and occupation obtained from school enrolment forms to provide a more accurate measure of school SES and avoid classifying dissimilar schools as “like schools”.
In principle, it is preferable to use individual family/student data rather than area-based data to measure school SES. However, there are practical problems in using information from enrolment forms which are also likely to create systematic bias against government schools in “like school” comparisons.
There are high non-response rates on the parent education and occupation questions on enrolment forms. They vary considerably from state to state and by school sector according to NAPLAN reports. They have also been highly volatile from year-to-year. Non-responses increased from 30–37 per cent of all school parents in 2007 to 40–47 per cent in 2008 and then declined to 17–25 per cent in 2009.
These high non-response rates and their volatility raise serious questions about the validity of using this data to measure school SES. It is difficult to understand how non-response rates can increase significantly in one year and then decline by so much in the following year.
The non-responses appear to be highly concentrated amongst lower education and occupation groups. This is demonstrated by comparing the NAPLAN results for students whose parent education and occupation is not stated with those for whom it is stated. For example, the average mean scores in the 2009 NAPLAN tests for students in the non-response group were below those for students whose parents only completed Year 12 or its equivalent. Their mean scores were also similar to students whose parents were in low skilled occupations.
As the missing data appears to be concentrated amongst low SES families, using school enrolment data to measure the SES of schools is likely to significantly over-estimate the SES of government schools, because low SES students comprise a much larger proportion of their enrolments than of private school enrolments. Many government schools will be incorrectly classified as “like schools” with higher SES private schools, and their average results then unfavourably compared with those schools.
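The size of this over-estimate again follows from simple arithmetic. The sketch below uses invented numbers (the scores, enrolment split and non-response rate are illustrative only, not taken from NAPLAN or My School data) to show how non-response concentrated among low SES families inflates a school's measured SES:

```python
# Illustrative sketch (invented numbers): non-response concentrated
# among low-SES families inflates a school's measured SES.
# A government school has 60 low-SES students (score 40) and
# 40 higher-SES students (score 60).
true_mean = (60 * 40 + 40 * 60) / 100  # 48.0

# Suppose half the low-SES parents skip the education/occupation
# questions, while all higher-SES parents respond. The SES measure
# can then only be computed from the 70 respondents.
respondents_low, respondents_high = 30, 40
measured_mean = (respondents_low * 40 + respondents_high * 60) / (
    respondents_low + respondents_high
)

print(true_mean)      # 48.0
print(measured_mean)  # ~51.4: the school looks higher-SES than it is
```

The higher the school's share of low SES families, and the more concentrated the non-response among them, the larger the gap between the measured and true SES.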
Thus, exactly the same biases as now exist in the comparison of so-called “like” government and private schools will continue.
It is also possible that the biases could be even worse than at present in some common circumstances. For example, school SES could be well over-estimated in government schools where a very large proportion of students come from low SES families, many of whom do not respond to the questions on education and occupation. There can be no assurance that using student enrolment data will improve accuracy in the measurement of school SES.
There are also other issues arising from the use of enrolment form data which may bias or distort “like school” comparisons. One issue is that students from low SES families frequently change schools, but where non-response rates among low SES families are high, such moves will not be reflected in the measured SES of either the exited or the receiving school. School SES will remain unchanged even if a large proportion of low SES students move from one school to another.
“Like school” comparisons on My School will remain fatally flawed and misleading for as long as non-responses to enrolment form questions remain substantial, either generally or for a significant proportion of schools. “Like school” comparisons should be discontinued while the data remains so inadequate.
At the very least, if school enrolment data is used to construct the SES measure for some or all schools for My School 2010, the non-response rates should be reported for each school for each Year level. In addition, a caveat should be entered in each school report that the “like school” comparisons may involve comparisons of dissimilar schools because of high non-response rates where they occur.
This is a summary of an Education Policy Brief published by Save Our Schools.