The Prime Minister and the Federal Minister for Education, Julia Gillard, have stated that a system of individual school performance reporting will be included in a new national education agreement due to start at the beginning of 2009.
The Minister for Education has said that she doesn’t want “silly” or “simplistic” league tables. She has rejected comparisons of the raw scores of all schools as unfair and misleading. Instead, she supports comparisons of ‘like’ schools as a way to improve school performance.
To date, the Minister has not provided any details or explanation of how like school comparisons will be made and what information will be published. All we know is that she is impressed by the model of school reporting used in New York City. So, clearly, this is one model being considered by the Government.
There are also several models of comparing student achievement in like schools in place at present in Australia. New South Wales, Victoria and Western Australia have been using like school comparisons for some time. Each state uses a different methodology for identifying like schools and there are differences in the comparative data provided to schools. New South Wales and Western Australia allow schools to publish some of the data.
This note is the first in a series about like school comparisons. It provides a description of the systems in place in New York and in Australia. Further notes in the series will analyse various features of these models and discuss the implications of introducing like school comparisons across Australia.
New York City
The New York City Department of Education publishes annual school Progress Reports that provide teachers, principals, and parents with detailed information about students and schools. The first Progress Reports were published in November 2007 for the 2006-07 school year. A citywide Progress Report summarising the results of all schools is also published.
Progress Reports give each school a score on its performance in three domains – School Environment, Student Performance and Student Progress – and on several measures within each domain. These scores are combined to give an overall score out of 100 for each school. Schools are also given a grade of A, B, C, D or F based on their domain scores and how they compare with other schools.
The various scores in each domain and the school grades are published in a Citywide Progress Report and individual school reports, all of which are available on the Department of Education website.
The Progress Reports evaluate schools in three areas:
- School Environment (15% of score), including attendance and the results of Learning Environment Surveys;
- Student Performance (30% of score), as measured by elementary and middle school students’ scores each year on the New York State tests in English Language Arts and Mathematics. For high schools, student performance is measured by diplomas and graduation rates; and
- Student Progress (55% of score), as measured by how much schools help students progress during the school year in subjects such as reading, writing, math, science, and history. Schools’ progress scores also rise when they help English Language learners, special education students and students who are not performing well at the beginning of the school year.
Schools also receive additional recognition for exemplary progress by students most in need of attention and improvement. This can increase a school’s overall grade.
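As a rough sketch, the published weighting scheme amounts to a weighted sum of the three domain scores. The sketch below assumes each domain score is on a common 0-100 scale and that the additional recognition is a simple additive credit; the actual NYC combination rules are more detailed than this.

```python
def overall_score(environment, performance, progress, extra_credit=0.0):
    """Combine the three domain scores into an overall score out of 100,
    using the published weights: 15% School Environment, 30% Student
    Performance, 55% Student Progress. The additional credit for
    exemplary progress by high-need students is assumed additive here."""
    return 0.15 * environment + 0.30 * performance + 0.55 * progress + extra_credit
```

A school scoring 100 in every domain would receive the maximum overall score of 100, and any extra credit can only raise a school's result, never lower it.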
A school’s results in each domain are compared to results of all schools of the same type (elementary, middle and high schools) across the City. Results are also compared to a peer group of 40 similar schools. These comparisons with other schools are reported as the percentage of the distance between the lowest and highest scores achieved by each school.
The citywide and peer school ranges are determined on the basis of results over the past two years for elementary and middle schools and the past four years for high schools (the reference period).
A score of 50% on a particular measure means that the school’s performance on that measure in the current year was exactly halfway between the bottom and top scores in the citywide or peer group range during the previous two or four years. Similarly, 75% signifies that the school’s score was three-quarters of the distance between the bottom and top of that range. Scores above 100% are possible if, in the year of the Progress Report, the school exceeds the top score in the reference period range.
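In other words, each reported percentage is a simple range position. A minimal sketch of the calculation (the function name is illustrative, not the Department's):

```python
def range_position(current_score, range_min, range_max):
    """Return the school's current-year score as a percentage of the
    distance between the lowest (range_min) and highest (range_max)
    scores achieved during the reference period. Results above 100
    mean the school beat the top of the historical range."""
    return (current_score - range_min) / (range_max - range_min) * 100
```

For example, a current score of 75 against a historical range of 50 to 100 sits exactly halfway up the range and is reported as 50%.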
Peer schools are schools that serve similar populations in terms of grade span, demographic composition, and/or average incoming State exam scores. A school’s peer group consists of the twenty schools above and twenty schools below it in the same grade span category when ranked by a “peer index”. The peer index of each school is reported in the individual school Progress Reports and the overall citywide Progress Report.
Different types of school are ranked by different peer indexes. Peer groups for K-5 and K-8 schools are determined by their student demographic profile, while peer groups for 6-8 schools and high schools are determined by student performance on state-wide English and mathematics exams.
The peer index used for schools in the K–5 or K–8 grade span is the weighted average of the percentage of students at the school eligible for free lunch (the Title I poverty rate) (40%), percentage of Black and Hispanic students (40%), percentage of the student population enrolled in Special Education (10%), and percentage of the student population made up of English Language Learners (10%). The index value is from 0-100 and a high value reflects a high need student population with a high percentage of students from low income, Black or Hispanic families.
The index for schools in the 6–8 grade span group is the average of the Proficiency Ratings its actively enrolled students had earned on their fourth grade State ELA and mathematics exams. The index for high schools is the average of the Proficiency Levels its actively enrolled students had earned on their State ELA and mathematics exams as 8th graders. The index value is from 1-4.5 and a high value in this case indicates a student population with low need.
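The K–5/K–8 weighting and the selection of the 40 neighbouring schools can be sketched as follows. The helper names are illustrative, not the Department's, and the treatment of schools near the ends of the ranking is an assumption.

```python
def k5_k8_peer_index(pct_free_lunch, pct_black_hispanic, pct_special_ed, pct_ell):
    """Weighted average of four demographic percentages for K-5 and K-8
    schools: 40% free lunch (Title I poverty rate), 40% Black and
    Hispanic students, 10% Special Education, 10% English Language
    Learners. Yields 0-100; higher means a higher-need population."""
    return (0.40 * pct_free_lunch + 0.40 * pct_black_hispanic
            + 0.10 * pct_special_ed + 0.10 * pct_ell)

def peer_group(indices, school_index):
    """Given the peer indices of all schools in one grade-span category,
    return the 20 schools above and 20 below the given school in the
    ranking (assumed here to simply truncate at the ends)."""
    ranked = sorted(indices)
    i = ranked.index(school_index)
    return ranked[max(0, i - 20):i] + ranked[i + 1:i + 21]
```

A school at the middle of its category thus ends up with exactly 40 peers, none of which is the school itself.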
New South Wales
Like school comparisons for government schools were introduced in NSW in 2005 as part of a revised system of annual reporting by schools. The system was initially introduced on a trial basis.
Schools can report academic achievement against the average for their ‘Like School Group’ (LSG), as well as the State, in their annual report. Whether to report against the LSG and/or the State is at each school’s discretion. The Department of Education does not publish the data for each school.
Schools can report the following against the LSG and/or State:
- percentage of students in each skill band (Year 3 & 5 Literacy and Numeracy) and/or each achievement level (Year 7 Literacy and Numeracy) and/or each performance band (School Certificate);
- relative performance of the school compared to the school average over time;
- the average progress in literacy and numeracy for matched students (value-added) for Years 3-5 and/or Years 5-10 and/or Years 10-12;
- the school average score for School Certificate subjects compared to the school average over time;
- the school mean score for all HSC subjects compared to the school mean over time.
Many schools publish the average score of students for Year 3, 5 and 7 literacy and numeracy tests, the average score for School Certificate subjects and the average score for all HSC subjects. They also publish corresponding averages for the like school group and the state average. Schools also publish the proportion of students performing at different proficiency levels for their school, the like school group and for the state as a whole.
All NSW government schools are allocated to one of nine LSGs based on the average socio-economic status (SES) of the school community and the school’s geographic isolation. There are four metropolitan LSGs differing in SES and five rural LSGs differing in SES and remoteness. Selective schools are placed in a separate LSG.
The school SES is determined by geo-coding the addresses of all of its students and allocating them to ABS Census Collection Districts (CDs). The Socio-Economic Indexes for Areas (SEIFA) Index of Disadvantage value associated with the CD is assigned to each address and the average value of all the student addresses is the school SES score. The school geographic isolation measure is based on the Accessibility/Remoteness Index of Australia (ARIA).
Victoria
The Victorian Education Department has been using data to compare the performance of ‘like schools’ since 1996. A new system of comparisons called Percentile/SFO Comparison Charts was piloted in 2007 and the results distributed to schools in 2008. These comparisons are not publicly reported.
Under the previous system, like school groups were identified according to the proportion of students who received the Education Maintenance Allowance or Commonwealth Youth Allowance and the proportion of students from a Language Background other than English. All government schools were divided into nine ‘like school groups’ according to the background characteristics of their students.
The new system uses SES as a benchmark for reporting and assessing school performance. Schools compare their actual performance with that predicted by their SES and that of similar schools. This is done by comparing the percentile rank of a school’s average test results for each Year level with the SES percentile range of other schools most like the school.
The SES of each school is measured by Student Family Occupation (SFO) density, which is derived from information about parent occupations provided on student enrolment forms. The data is based on five occupational groups, each given a different weighting. The higher a school’s SFO density, the lower its SES; that is, a high proportion of its students come from families with less skilled occupations.
This system of comparisons is based on the assumption that if SES, as measured by the SFO, was the sole determinant of student achievement, a school’s achievement percentile would be expected to be similar to its SFO percentile. In effect, the SFO percentile range is used as a predictor of a school’s test results.
The exact average test scores and the actual SFO density are not reported directly to schools; each is converted to a percentile rank. Nor is the SFO percentile of individual schools presented. The Department of Education provides access to the data for individual schools but it is not publicly available. Schools are not required to report student achievement against other like schools.
Individual schools can compare their performance with that predicted by their SFO by plotting the percentile of each average test score against the SFO percentiles of schools within +/-10 percentile points of their own school’s SFO percentile [see Attachment 1]. For example, a school’s Year 3 average reading score may be at the 20th percentile, meaning that 20% of schools have a lower average score; that is, the school has a higher average test score in Year 3 reading than 20% of all schools. If the SFO percentile range of like schools is 12-32%, the school could be said to be performing at the level predicted by that range. However, if the SFO percentile range of like schools is 25-45%, the school could be said to be performing below the predicted level.
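The comparison logic amounts to checking whether the school's achievement percentile falls inside, above, or below the like-school SFO percentile range. A sketch using the worked example above (the return labels are illustrative):

```python
def performance_vs_prediction(achievement_percentile, sfo_range):
    """Compare a school's achievement percentile on a test with the SFO
    percentile range of its like schools (its own SFO percentile +/-10
    points). Returns whether performance is above, at, or below the
    level predicted by the like-school range."""
    low, high = sfo_range
    if achievement_percentile > high:
        return "above predicted level"
    if achievement_percentile < low:
        return "below predicted level"
    return "at predicted level"
```

With the example figures, an achievement percentile of 20 sits inside a like-school range of 12-32% but falls short of a range of 25-45%.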
Western Australia
Like school comparisons of student achievement are provided to Western Australian government schools as part of what is called the ‘Data Club’. The Data Club was developed as a program to enable school principals and teachers to analyse their students’ achievement scores on the Western Australian Literacy and Numeracy Assessment (WALNA). It allows schools to compare their performance with that of like schools and to identify the value-added component of their school. The information is only provided to schools, but many of them include broad comparative results in school annual reports.
Schools are grouped into ‘like-school’ bands based on their socio-economic index score. There are nine bands. The most disadvantaged schools are in Band 0 and the most advantaged schools are in Band 8. Band 4 schools are ‘average’.
Schools are provided with their mean and median scores, as well as the distribution of scores, for the WALNA tests in literacy and numeracy for Years 3, 5 and 7 together with those for their like school group, schools in their District and with the State as a whole. They are also provided with their average scores over time. A set of charts shows a school’s average scores and the distribution and range of scores compared with their like school group, District and State. This enables schools to compare the difference between their highest and lowest scores with that of other schools as well as the extent to which student scores are clustered around the school average. Many schools currently report this information in their school annual reports.
Like schools are determined on the basis of their score on a socio-economic index constructed by the Department of Education. The index is based on ABS Census data at the CD level and on each school’s actual percentage of Aboriginal students. It comprises the double-weighted dimensions of education, occupation and Aboriginality, and the single-weighted dimensions of family income and single-parent families.
The SES score for each school is derived by geo-coding the addresses of students at each school to CDs. The index was initially calculated using a sample of addresses in each school but now all student addresses are used. The number of students for each CD is determined and the ABS Census data is used to calculate an SES score for each CD by combining index scores for each dimension. The SES score for each school is obtained by weighting each CD SES score by the number of students resident in each CD and taking account of the percentage of Aboriginal students in each school.
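The enrolment-weighting step can be sketched as follows. The Department's further adjustment for the percentage of Aboriginal students is not described in enough detail to reproduce, so it is omitted here.

```python
def school_ses_score(cd_data):
    """Enrolment-weighted average of CD-level SES scores.

    cd_data: list of (students_resident_in_cd, cd_ses_score) pairs,
    built by geo-coding every student address to its ABS Census
    Collection District. The adjustment for the school's percentage
    of Aboriginal students is applied separately and not shown."""
    total_students = sum(n for n, _ in cd_data)
    return sum(n * score for n, score in cd_data) / total_students
```

For example, a school drawing 10 students from a CD scored 1000 and 30 students from a CD scored 900 would receive a school SES score of 925, reflecting where most of its students live.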