The future for Australia was on show this week as England went through its annual ritual hunt for the worst performing schools in the country.
Following publication of primary school test results on a government website, league tables of the best and worst schools in England were published by all the major newspapers.
Naming and shaming the worst schools was a feature of the reporting, as it has been for the last 15 years or more in England. For example, the Guardian and the Daily Mail highlighted the top and worst performing schools in the country.
The BBC commentary on the results highlighted the top school and the two worst performing schools.
Many of England’s regional newspapers joined in the hunt. For example, the London Evening Standard published a table of the top 10 primary schools in London and the bottom 10 schools.
The Express and Star highlighted the worst performing school in the West Midlands. The Leicester Mercury named the top 3 and bottom 3 schools in the county. The Derby Evening Telegraph highlighted the top and bottom schools.
It all shows what can be expected in Australia from next year when the results for all schools are published on the national My School website. Naming and shaming of schools with poor results will become a regular national and regional pastime. We can expect an annual media frenzy to find the best and worst schools in Australia.
This week also demonstrated yet again how easy it is for the media to use government-provided data on school results to construct league tables. Major national newspapers, the BBC and several regional newspapers published various league tables on their websites, built from the tables of school results that the UK Department for Children, Schools and Families lists in alphabetical order for each of 150 Local Authority areas.
The Times published a 296-page table of results for all primary schools in England ranked from top to bottom. It also published a separate table of the top 100 schools in England ranked according to their combined average score in English, mathematics and science for students who achieved above the expected level (Level 5). The Times also provided a search facility for users to obtain the five top schools in their area.
The BBC published a table of the top 205 primary schools in rank order by their combined average score in English, mathematics and science for students achieving above the expected level (Level 5). It also published a table of the 196 primary schools with the worst results, with the worst at the top, based on their combined average score on the three tests.
The BBC also provided a table of the 268 schools that achieved the maximum aggregate score possible on the tests. This means that all their Year 6 students achieved at least the expected standard (Level 4) in each subject.
The Independent published a table of the top 150 schools based on their combined average test score. The Daily Mail published league tables of the top 200 schools, ranked according to their average scores in English, mathematics and science.
The Guardian and the Daily Telegraph published full league tables of all schools in each local government area ranking schools from top to bottom on their combined average score for English, mathematics and science. Many regional newspapers such as the London Evening Standard, the Newcastle Evening Chronicle and the Derby Evening Telegraph also published league table rankings of all schools in each local government area in their region.
In each case, the results published electronically by the Department were simply re-configured by the media to construct a variety of national and regional league tables. In some cases, data management companies were employed to re-configure the data. For example, the Daily Telegraph league tables were prepared by Dalton Firth Limited, an IT and software consultancy firm that develops software for database and web-based knowledge systems.
This gives the lie to the assurances about not supporting league tables given by Australia’s Education Minister, Julia Gillard, and by her state and territory counterparts. These assurances have been designed to mislead and amount to outright deceit. It will be a simple matter for major media outlets to employ special software programs to trawl through the individual school profiles on the ACARA website to collect the results of each school and arrange them in a league table.
We can expect to see an annual list of the top 10 and bottom 10, or more, schools in Australia and in each state and territory published in the media as occurs every year in England. We can expect various national, metropolitan, suburban and provincial newspapers to vie with each other about the type of league table they produce, just as they do in England.
The publication of tables of school results by the Hobart Mercury and the Brisbane Courier-Mail earlier this year is just an indication of what is to come. Indeed, Gillard is making it easy for suburban and provincial newspapers to compile their own local area league tables because the results of each school in each local area can be directly linked from the My School website.
The fact is that publishing school results means league tables and public highlighting of the top and worst performing schools. This helps sell newspapers.
The experience this week with reporting school results in England also brings into question Julia Gillard’s claim that so-called ‘like school’ comparisons will ensure contextual reporting of results so that schools serving high income areas are not unfairly compared to schools in low income areas.
England has its own version of school results adjusted for differences in student composition between schools. The UK Department for Children, Schools and Families publishes so-called ‘contextual value added’ scores which adjust raw school results for the impact of external factors including student mobility, ethnic background and disadvantage to obtain a relative measure of the ‘school effect’ on student learning.
These tables were largely ignored by the English media, which, as in other years, focused on the average results of schools, showing little interest in the impact of the social background of students on school results. Of the national papers, only the Independent and the Daily Mail used the ‘contextual value added’ tables to highlight the top performing school. The Independent published a table of the top 50 schools ranked according to ‘value added’. The Guardian did a profile on the top performing school by ‘value added’, a school located in one of the most deprived areas of Nottingham.
Yet another lesson from this week is that league tables distort teaching priorities in schools.
The latest test results in England show a decline over the last few years in the proportion of students attaining Level 5 scores, the level above expected achievement in Year 6, in English and science. The Times analysis said the results suggested that high achieving students are being neglected as teachers concentrate on raising the attainment of borderline pupils performing just below national standards:
This has fuelled suspicion that schools, under pressure to impress in league tables, are channelling their resources in trying to get children above the Level 4 threshold, rather than stretching the brightest pupils.
This suspicion is confirmed by much research in England and the United States which shows that as schools focus on obtaining a good ranking on league tables they concentrate on improving the results of students just below benchmark standards and neglect very low and high performing students.
The Times also noted complaints about increasing time being devoted to test preparation at the expense of subjects which are not tested. There is extensive evidence of this. For example, a major review of the UK curriculum released last October found that the testing and reporting of school results in English and maths has distorted children’s learning and eroded their entitlement to a broad education. It said that 10 and 11-year-olds spend around half their time in the classroom studying English and maths and this has “squeezed out” other subjects from the curriculum.
So, this is another consequence of league tables we can expect in Australia once the My School website is operational. We can expect a narrowing of the curriculum and a diminished education for children as schools are forced to focus on tested subjects to improve or maintain their ranking.
A final observation on England’s results is that they undermine Gillard’s claim that publishing school results will lead to improved student and school achievement.
Despite all the manipulation of school results and teaching to the test that goes on now in English schools, many schools did worse than last year. Only 47% of schools in England achieved a better performance on all three tested subjects compared to last year, and 51% did worse. More schools also had fewer than 50% of their students achieving the expected standard in English and mathematics – 885 this year compared to 798 in 2008.
These results, together with the evidence of the narrowing of the curriculum, have caused many to question the value of league tables, including some advocates. For example, the Times noted that school results have stagnated in England, or are even starting to decline, and commented that this “raises questions about whether all the effort is worthwhile”. The National Association of Head Teachers and the National Union of Teachers in the UK have decided to boycott the tests next year unless they are scrapped because of the disastrous impact on schools and the education of children.
Gillard should look at the evidence on reporting school results and league tables in England. There is no substantial research to show that league tables have improved student achievement. Far from improving school results, league tables leave children with a narrowed education which values rote learning over inquiry and understanding.
Unless there is a national boycott of the NAPLAN tests by principals, teachers and parents, an English education is now the future for Australia.