New Study Shows that Public Schools Do Better Than Private Schools in Maths

A new US study has found that public school students out-perform private school students in maths. The higher performance by public schools was attributed to more certified maths teachers and a modern maths curriculum and teaching.

Media Release – New National Education Goals Are Fatally Contradictory

Tuesday January 27, 2009

A policy brief issued today by the public education advocacy group, Save Our Schools, claims that the new national education goals promulgated at the end of 2008 are fatally contradictory and fail on equity.

SOS national convenor, Trevor Cobbold, said the national commitment to reporting school results will work against improving equity in education.

“The new school year opens under new national education goals that will create a Janus-like school system tugged in different directions. Reporting school results and improving equity in education are incompatible goals.

“Reporting the results of individual schools makes the publication of league tables of school results inevitable. It will entrench choice and competition between schools as the fundamental organizing feature of school systems in Australia.

“The international evidence shows that equity in education is diminished where choice and competition rule. It leads to increased social segregation between schools as better-off parents use league tables to ‘vote with their feet’, as the PM says the system is designed to do. Increasing concentrations of students from low socio-economic status families in some schools tend to lead to lower average results and increase the achievement gap between rich and poor students.

“The new Melbourne Declaration of national goals for education means that the already large achievement gap between students from high and low income families in Australia will widen.”

Mr. Cobbold said that the contradictory signals of the Melbourne Declaration are compounded by its weaker commitment to improving equity.

“The SOS Policy Brief shows that the new Declaration weakens the previous Adelaide Declaration in three ways:

  • It removes the key goal of achieving social justice in schooling;
  • It increases the emphasis on equity in access to education and reduces the emphasis on achieving equity in student outcomes; and
  • It weakens the commitment to eliminating achievement gaps between students from different social groups.

“Dropping the previous goal of achieving social justice in schooling is symbolic of the retreat on equity in the new national goals.”

Mr. Cobbold said that the introduction of league tables of school results completes the Howard Government’s program to develop a national market in school education based on English and US models.

“Australian governments, led by the Rudd Government, have chosen to follow the example of the UK and the US – whose school systems generally perform worse than Australia’s – while ignoring the example of school systems that perform better. The highest achieving countries such as Finland and Korea don’t publish comparisons of school results.

“The Rudd Government has achieved what former Howard Government Education Minister, David Kemp, could only aspire to. Labor’s supposed revolution in education is one conceived by Kemp.”

28 January 2009

Contact: Trevor Cobbold 0410 121 640 (m)

Download the Policy Brief – New National Goals for Education Fail on Equity

SOS – Fighting for Equity in Education
http://www.soscanberra.com/


‘Hot Housing’ Students to Improve League Table Rankings

Sunday January 18, 2009

The publication of new league tables of secondary school results in England last week brought to light another way of manipulating school results.

Secondary schools are encouraging students to take exams early in order to give them a chance to repeat the exam if their results are not good enough. The practice is called ‘hot housing’ students to get better results.

The Guardian newspaper (15 January) reported that many schools are allowing students to take the same General Certificate of Secondary Education (GCSE) exam up to three times in the space of a year in order to improve their results. Examination boards have reported substantial increases in the number of students who took their GCSE exams early in order to allow time for a re-sit of the exam in the case of failure.

According to the report, many students are taking their exams up to six months early in order to allow time for a re-sit. The exam board, Edexcel, said it had seen a 67% rise in the number of students taking a whole GCSE early, while the number re-taking modules to improve their scores had nearly doubled.

The Guardian report stated that this tactic is being encouraged increasingly across England by school principals desperate to improve their school’s ranking in the league tables. Teachers’ leaders blamed Government pressure on school principals to improve results and move up the school league tables for the increased focus on exams that was putting children under stress and detracting from the depth of their learning.

Alan Smithers, Professor of Education at Buckingham University, told the Guardian that: “League tables have got all out of proportion and schools will now do all they can to improve their place. Early entry is one way they are doing it. Other ways include focusing on the pupils on the C-D border. We’re in danger of producing a set of statistics that no longer accurately reflect pupils’ progression but the work the schools can do to improve their scores.”

‘Hot housing’ students for literacy and numeracy tests used to judge school performance is now a feature of most United States school systems. Classrooms are being turned into test preparation factories. Weeks and months can be devoted to test preparation at the expense of other parts of the curriculum and other learning areas. Even recess time has been cut or eliminated in many schools so that more time can be given to test preparation.

Reporting individual school results inevitably leads to league tables and immense pressure on schools and teachers to improve school rankings. The evidence is that schools look for ‘quick fixes’ to improve their ranking. ‘Hot housing’ students by repeated test preparation and re-taking of exams is just one among many ways schools try to manipulate their results.

Other well-established practices in England and the United States include removal of low-achieving students from tests by placing them in special education streams, suspending them or encouraging them to be absent on test days and retaining them in grades not tested. The incidence of teachers helping students during tests or changing answers has been shown to increase following the introduction of reporting school results.

Another way to artificially boost school results is to make greater use of special dispensations to help students during tests. They include having extra time for tests, use of a dictionary, small group or one-on-one testing, oral reading of directions and use of an aide for transcribing responses. They also include special allowances in the case of illness such as alternative assessments and sitting tests at different times from other students.

The New York City school reporting system, so admired by Julia Gillard, has encouraged schools to use special dispensations for students during tests to improve their results. There has been a massive increase in so-called “accommodations” for students taking tests since the new reporting system was introduced.

This practice is already being used in Australia, as shown by last year’s revelations in the Sydney Morning Herald about excessive use of dispensations for the HSC exam by some elite NSW private schools. Up to 30% of students at some private schools were given special provisions in the 2008 HSC, compared with an average of 7% of government high school students. Masada College claimed special dispensations for 30% of its students and Scots College claimed them for 25% of its students. In 2006, Reddam House received special consideration for 36% of its HSC students.

All this is a harbinger of what can be expected across Australia with the decision of Australian Governments last year to go ahead with reporting of individual school results. Manipulation of school results will become a priority for schools instead of real learning in the classroom.

Trevor Cobbold


New study shows that competition and choice do not raise student achievement

At the heart of the Rudd Government’s policy of reporting individual school results is an assumption that competition between schools will raise student achievement. The theory is that reporting school results will better inform parent choice of schools and the competition between schools for enrolments will act as an incentive for schools to improve student achievement.

Media Release – “Creative Confusion” By Mr Klein

Sunday November 23, 2008

Save Our Schools today challenged the claims of visiting New York City Schools Chancellor, Joel Klein, about large increases in school performance shown in newly published report cards for the City’s schools.

SOS convenor, Trevor Cobbold, said that the results strain credibility and there are suspicions that school grades have been manipulated to boost results.

“Mr. Klein has said that his approach to managing the New York City public education system was to cause ‘some creative confusion’. His new school report cards have certainly succeeded in creating confusion about the real state of schools in the City. The New York Times said that many New Yorkers are ‘somewhat befuddled’ by inconsistencies with other test results for the City’s schools.

“The huge increase in the number of schools being graded as A is hard to believe. The new progress reports show that the number of schools graded as A increased by 80% over the previous year and that 70% of schools that failed (F) last year received an A or B. A Columbia University academic has described these changes as ‘magical transformations’.

“Major inconsistencies between City, NY State and Federal assessments of the same schools have been revealed by the New York Times (16 September 2008). For example:

  • Two elementary schools that received an A in Klein’s report cards were added to the New York State’s list of failing schools this year.
  • In over 60 of the 394 elementary schools rated A by Klein’s report cards, more than half the students failed to reach proficiency on the New York State’s reading test.
  • 30% of the elementary schools deemed failures under the Federal No Child Left Behind Act received an A in the report cards, while 16 of the 18 schools given an F are rated satisfactory under Federal guidelines.

“The only independent check on student achievement in New York City shows a completely different picture from that claimed by Klein,” said Mr Cobbold.

“The results of the National Assessment of Educational Progress administered by the US Department of Education show that student achievement in New York City has stagnated since 2003. The achievement gaps between Blacks and Whites, between Hispanics and Whites and between low and high income students are as large as they were when Klein began to overhaul the system.”

Mr Cobbold said there was added confusion because the cut-off scores for each letter grade were reduced for the 2007-08 tests (see attached table).

“The reductions in cut-off scores have raised suspicions that the data was manipulated to artificially boost the results.

“This year, 79% of elementary schools received an A or B compared to 71% last year and 83% of high schools received an A or B compared to 65% last year. Virtually all of the increase in elementary schools and about two-thirds of the increase for high schools appears due to the reduction in grade cut-off scores. If last year’s cut-off points had been used, only 72% of elementary schools would be rated as A or B, almost no change from last year, and only 71% of high schools.

“The reduction in cut-off scores is not even mentioned in the list of changes to the school progress reports appended to the new technical guides to the reports published by the Department of Education. The guides for last year’s reports stated the cut-off scores ‘will be used for the next several years’. They lasted only one year before being revised down.

“Clearly, Mr. Klein has some explaining to do before his claims can be taken seriously.”

Mr Cobbold warned against adopting the New York City’s school reports in Australia.

“The New York City system of school reports lacks credibility and reliability. US education experts have criticised it variously as ‘inherently unreliable’, ‘dubious’, ‘baroque’ and producing ‘bizarre results’. Its methodology is so arcane and arbitrary that it is open to manipulation in a variety of ways to artificially boost results.

“Adopting such a model in Australia would lead to inaccurate and misleading comparisons of school performance. Experience with publishing school results elsewhere in the United States and England shows that they increase social segregation and inequity in education and stigmatise low income and ethnic students as failures.

“Publication of school results is heavily criticised in England for these reasons. Wales and Northern Ireland have stopped publishing school performance tables in recent years because they are unreliable and inaccurate measures of school quality and create perverse incentives.

“Governments all around Australia know where the problems are in our schools. We don’t need a reporting system which has already failed to prove its worth to find this out. What is needed is a real commitment of resources to disadvantaged students and schools.”

24 November 2008

Contact: Trevor Cobbold 0410 121 640 (m)

Download table of NYC school grade cutoff scores

Note: The cut-off scores in the attached table refer to the overall rated score out of 100 achieved by a school on a range of measures covering School Environment, Student Achievement and Student Progress. Source: New York City Department of Education. Progress Report Educator Guides

The 2007-08 Guides can be directly accessed while the 2006-07 Guides can be obtained by using the site’s search engine.


Models of Like School Comparisons

Sunday November 9, 2008

The Prime Minister and the Federal Minister for Education, Julia Gillard, have stated that a system of individual school performance reporting will be included in a new national education agreement due to start at the beginning of 2009.

The Minister for Education has said that she doesn’t want “silly” or “simplistic” league tables. She has rejected comparisons of the raw scores of all schools as unfair and misleading. Instead, she supports comparisons of ‘like’ schools as a way to improve school performance.

To date, the Minister has not provided any details or explanation of how like school comparisons will be made and what information will be published. All we know is that she is impressed by the model of school reporting used in New York City. So, clearly, this is one model being considered by the Government.

There are also several models of comparing student achievement in like schools in place at present in Australia. New South Wales, Victoria and Western Australia have been using like school comparisons for some time. Each state uses a different methodology for identifying like schools and there are differences in the comparative data provided to schools. New South Wales and Western Australia allow schools to publish some of the data.

This note is the first in a series about like school comparisons. It provides a description of the systems in place in New York and in Australia. Further notes in the series will analyse various features of these models and discuss the implications of introducing like school comparisons across Australia.

New York City

The New York City Department of Education publishes annual school Progress Reports that provide teachers, principals, and parents with detailed information about students and schools. The first Progress Reports were published in November 2007 for the 2006-07 school year. A citywide Progress Report summarising the results of all schools is also published.

Progress Reports give each school a score on its performance in three domains – School Environment, Student Performance and Student Progress – and on several measures within each domain. These scores are combined to give an overall score out of 100 for each school. Schools are also given a grade of A, B, C, D or F based on their domain scores and how they compare with other schools.

The various scores in each domain and the school grades are published in a Citywide Progress Report and individual school reports, all of which are available on the Department of Education website.

The Progress Reports evaluate schools in three areas:

  • School Environment (15% of score), including attendance and the results of Learning Environment Surveys;
  • Student Performance (30% of score), as measured by elementary and middle school students’ scores each year on the New York State tests in English Language Arts and Mathematics. For high schools, student performance is measured by diplomas and graduation rates; and
  • Student Progress (55% of score), as measured by how much schools help students progress during the school year in subjects such as reading, writing, math, science, and history. Schools’ progress scores also rise when they help English Language learners, special education students and students who are not performing well at the beginning of the school year.

Schools also receive additional recognition for exemplary progress by students most in need of attention and improvement. This can increase a school’s overall grade.

A school’s results in each domain are compared to the results of all schools of the same type (elementary, middle and high schools) across the City. Results are also compared to a peer group of 40 similar schools. These comparisons are reported as the school’s position within the range between the lowest and highest scores achieved by comparable schools, expressed as a percentage of the distance between them.
The citywide and peer school ranges are determined on the basis of the past two years’ results for elementary and middle schools and the past four years’ results for high schools (the reference period).

A score of 50% on a particular measure means that the school’s performance on that measure in the current year was exactly halfway between the bottom and top scores in the citywide or peer group range during the previous two or four years. Similarly, 75% signifies that the school’s score was three-quarters of the distance between the bottom and top of that range. Scores above 100% are possible if, in the year of the Progress Report, the school exceeds the top score in the reference period range.
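The range comparison described above reduces to simple arithmetic. A purely illustrative sketch (the actual Department of Education calculation involves further adjustments not described here):

```python
def range_score(school_score, range_low, range_high):
    """Express a school's current-year score as a percentage of the
    distance between the lowest and highest scores achieved in the
    citywide or peer group range over the reference period."""
    return 100 * (school_score - range_low) / (range_high - range_low)

# A school exactly halfway between the bottom and top of the range:
print(range_score(60, 40, 80))   # 50.0
# A school that exceeds the top score of the reference period scores above 100%:
print(range_score(90, 40, 80))   # 125.0
```

The scores and range bounds above are invented for illustration; only the halfway-between-bottom-and-top logic comes from the Progress Report description.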

Peer schools are schools that serve similar populations in terms of grade span, demographic composition, and/or average incoming State exam scores. A school’s peer group consists of the twenty schools above and twenty schools below it in the same grade span category when ranked by a “peer index”. The peer index of each school is reported in the individual school Progress Reports and the overall citywide Progress Report.

Different types of school are ranked by different peer indexes. Peer groups for K-5 and K-8 schools are determined by their student profile, while peer groups for 6-8 schools and high schools are determined by student performance on state-wide English and mathematics exams.

The peer index used for schools in the K–5 or K–8 grade span is the weighted average of the percentage of students at the school eligible for free lunch (the Title I poverty rate) (40%), the percentage of Black and Hispanic students (40%), the percentage of the student population enrolled in Special Education (10%), and the percentage of the student population made up of English Language Learners (10%). The index value ranges from 0 to 100; a high value reflects a high-need student population with a high percentage of students from low income, Black or Hispanic families.

The index for schools in the 6–8 grade span group is the average of the Proficiency Ratings its actively enrolled students earned on their fourth grade State ELA and mathematics exams. The index for high schools is the average of the Proficiency Levels its actively enrolled students earned on their State ELA and mathematics exams as 8th graders. The index value ranges from 1 to 4.5; a high value in this case indicates a low-need student population.
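The K–5/K–8 peer index is a straightforward weighted average of the four demographic percentages. A rough sketch using the 40/40/10/10 weights given above (the school figures are hypothetical):

```python
def k8_peer_index(pct_free_lunch, pct_black_hispanic, pct_sped, pct_ell):
    """Weighted average of four demographic percentages (0-100 each),
    using the 40/40/10/10 weights described for K-5 and K-8 schools.
    A higher value reflects a higher-need student population."""
    return (0.40 * pct_free_lunch
            + 0.40 * pct_black_hispanic
            + 0.10 * pct_sped
            + 0.10 * pct_ell)

# Hypothetical school: 80% free lunch, 70% Black/Hispanic students,
# 10% in Special Education, 20% English Language Learners.
print(k8_peer_index(80, 70, 10, 20))
```

This sketch shows only the stated weighting; how the Department rounds or adjusts the index is not covered in the description above.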

New South Wales

Like school comparisons for government schools were introduced in NSW in 2005 as part of a revised system of annual reporting by schools. The system was initially introduced on a trial basis.

Schools can report academic achievement against the average for their ‘Like School Group’ (LSG), as well as the State, in their annual report. The decision to report against LSG and/or the State is optional for the school. The Department of Education does not publish the data for each school.

Schools can report the following against the LSG and/or State:

  • percentage of students in each skill band (Year 3 & 5 Literacy and Numeracy) and/or each achievement level (Year 7 Literacy and Numeracy) and/or each performance band (School Certificate);
  • relative performance of the school compared to the school average over time;
  • the average progress in literacy and numeracy for matched students (value-added) for Years 3-5 and/or Years 5-10 and/or Years 10-12;
  • the school average score for School Certificate subjects compared to the school average over time;
  • the school mean score for all HSC subjects compared to the school mean over time.

Many schools publish the average score of students for Year 3, 5 and 7 literacy and numeracy tests, the average score for School Certificate subjects and the average score for all HSC subjects. They also publish corresponding averages for the like school group and the state average. Schools also publish the proportion of students performing at different proficiency levels for their school, the like school group and for the state as a whole.

All NSW government schools are allocated to one of nine LSGs based on the average socio-economic status (SES) of the school community and the school’s geographic isolation. There are four metropolitan LSGs differing in SES and five rural LSGs differing in SES and remoteness. Selective schools are placed in a separate LSG.

The school SES is determined by geo-coding the addresses of all of its students and allocating them to ABS Census Collection Districts (CDs). The Socio-Economic Indexes for Areas (SEIFA) Index of Disadvantage value associated with the CD is assigned to each address and the average value of all the student addresses is the school SES score. The school geographic isolation measure is based on the Accessibility/Remoteness Index of Australia (ARIA).

Victoria

The Victorian Education Department has been using data to compare the performance of ‘like schools’ since 1996. A new system of comparisons called Percentile/SFO Comparison Charts was piloted in 2007 and the results distributed to schools in 2008. These comparisons are not publicly reported.

Under the previous system, like school groups were identified according to the proportion of students who receive the Education Maintenance Allowance or Commonwealth Youth Allowance and the proportion of students from a Language Background other than English. All government schools were divided into nine ‘like school groups’ according to the background characteristics of their students.

The new system uses SES as a benchmark for reporting and assessing school performance. Schools compare their actual performance with that predicted by their SES and that of similar schools. This is done by comparing the percentile rank of a school’s average test results for each Year level with the SES percentile range of other schools most like the school.

The SES of each school is measured by Student Family Occupation (SFO) density, which is derived from information about parent occupations provided on student enrolment forms. The data is based on five occupational groups, each given a different weighting. The higher a school’s SFO density, the lower its SES; that is, a higher proportion of its students come from families with less skilled occupations.

This system of comparisons is based on the assumption that if SES, as measured by the SFO, was the sole determinant of student achievement, a school’s achievement percentile would be expected to be similar to its SFO percentile. In effect, the SFO percentile range is used as a predictor of a school’s test results.

The exact average test scores and the actual SFO density are not directly reported to schools; each is converted to a percentile rank. Nor is the SFO percentile of individual schools presented. The Department of Education provides access to the data for individual schools but it is not publicly available. Schools are not required to report student achievement against other like schools.

Individual schools can compare their performance to that predicted by their SFO by plotting the percentile of each average test score against the SFO density of schools whose SFO percentile range is +/-10% of that of their own school [see Attachment 1]. For example, a school’s Year 3 average reading score may be at the 20th percentile, meaning that 20% of schools have a lower average score; that is, the school has a higher average test score in Year 3 reading than 20% of all schools. If the SFO percentile range of like schools is 12-32%, the school could be said to be performing at the level predicted by the SFO percentile range of like schools. However, if the SFO percentile range of like schools is 25-45%, the school could be said to be performing below the predicted level.
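The comparison in the example above amounts to checking whether a school’s achievement percentile falls within, above, or below the like-school SFO percentile range. A simplified sketch (the actual comparison is made with the Department’s Percentile/SFO Comparison Charts, not a calculation by schools):

```python
def compare_to_like_schools(achievement_pct, sfo_pct, band=10):
    """Compare a school's achievement percentile with the SFO percentile
    range of 'like' schools (the school's own SFO percentile +/- 10 points)."""
    low, high = sfo_pct - band, sfo_pct + band
    if achievement_pct < low:
        return "below predicted level"
    if achievement_pct > high:
        return "above predicted level"
    return "at predicted level"

# Year 3 reading at the 20th percentile, like-school SFO range 12-32%:
print(compare_to_like_schools(20, 22))   # at predicted level
# Same score, but like-school SFO range 25-45%:
print(compare_to_like_schools(20, 35))   # below predicted level
```

The +/-10 band comes from the description above; the specific percentiles are illustrative only.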

Western Australia

Like school comparisons of student achievement are provided to Western Australian government schools as part of what is called the ‘Data Club’. The Data Club was developed as a program to enable school principals and teachers to analyse their students’ achievement scores on the Western Australian Literacy and Numeracy Assessment (WALNA). It allows schools to compare their performance with that of like schools and to identify the value-added component of their school. The information is only provided to schools, but many of them include broad comparative results in school annual reports.

Schools are grouped into ‘like-school’ bands based on their socio-economic index score. There are nine bands. The most disadvantaged schools are in Band 0 and the most advantaged schools are in Band 8. Band 4 schools are ‘average’.

Schools are provided with their mean and median scores, as well as the distribution of scores, for the WALNA tests in literacy and numeracy for Years 3, 5 and 7 together with those for their like school group, schools in their District and with the State as a whole. They are also provided with their average scores over time. A set of charts shows a school’s average scores and the distribution and range of scores compared with their like school group, District and State. This enables schools to compare the difference between their highest and lowest scores with that of other schools as well as the extent to which student scores are clustered around the school average. Many schools currently report this information in their school annual reports.

Like schools are determined on the basis of their score on a socio-economic index constructed by the Department of Education. The index is based on ABS Census data at the CD level and on each school’s actual percentage of Aboriginal students. It comprises the double-weighted dimensions of education, occupation and Aboriginality, and the single-weighted dimensions of family income and single parent families.

The SES score for each school is derived by geo-coding the addresses of students at each school to CDs. The index was initially calculated using a sample of addresses in each school but now all student addresses are used. The number of students for each CD is determined and the ABS Census data is used to calculate an SES score for each CD by combining index scores for each dimension. The SES score for each school is obtained by weighting each CD SES score by the number of students resident in each CD and taking account of the percentage of Aboriginal students in each school.
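The final step described above is a student-weighted average of CD-level SES scores. A simplified sketch, omitting the Aboriginality adjustment (whose exact form is not specified here); the CD scores and student counts are invented for illustration:

```python
def school_ses(cd_scores):
    """Weight each Census Collection District's SES score by the number
    of the school's students resident in that CD.

    cd_scores: list of (cd_ses_score, n_students) pairs."""
    total_students = sum(n for _, n in cd_scores)
    return sum(ses * n for ses, n in cd_scores) / total_students

# Hypothetical school drawing students from three CDs:
print(school_ses([(95.0, 40), (105.0, 40), (110.0, 20)]))   # 102.0
```

The weighting ensures that a CD contributes to the school score in proportion to how many of the school’s students actually live there.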

Trevor Cobbold


Media Release 9 October 2008 – Gillard Should Come Clean on School Reports

Wednesday October 8, 2008

Save Our Schools, a Canberra-based public education advocacy group, today called on the Education Minister, Julia Gillard, to release details of her controversial school performance reporting plan.

SOS spokesman, Trevor Cobbold, said that the imminent visit of New York Schools Chancellor, Joel Klein, is an opportunity for informed public debate about school reports.

“Julia Gillard wants schools to be open and transparent about their performance. Yet, she is not applying the same standard to herself. She has restricted public information and debate about her proposal. It is all being decided under the cloak of secrecy.

“It is time for Gillard to come clean and reveal the details of what she proposes for Australian schools.

“Klein is being brought to Australia to tout New York’s school progress reports. Let us have an informed debate while Klein is here and not just a one-sided presentation to bolster Gillard’s secret negotiations with State and Territory Governments.

“Parents, teachers and the public are entitled to know what school performance information will be made public and how schools will be compared. They need to be able to assess whether the information can be used to construct misleading league tables, whether it will actually reflect school performance rather than family social background and whether the information is statistically valid and reliable.

“Gillard says that she rejects ‘simplistic and silly’ league tables and wants to compare ‘like schools’. However, the Klein model that so ‘impresses’ her fails both tests.

“The New York system reports the performance scores of all schools, thus making it possible to create school league tables. Many of its so-called ‘school peer groups’ are very un-alike in their social composition.”

Mr Cobbold said that the Education Minister’s refusal to provide the details of her proposal contradicts the Prime Minister’s promise of open government.

“It seems that it is all being decided behind closed doors with the axe of Commonwealth funding held over the heads of State and Territory Governments to ensure compliance. What a way to conduct the open government promised by the Prime Minister!”

“The Prime Minister’s message clearly has not got through to his Deputy Prime Minister and Education Minister. Instead, she is taking her cues from her champion, Joel Klein, on how to force through controversial measures without public debate.

“Secrecy and avoidance of public debate are characteristic of how Klein has implemented change in New York’s schools. There too, teacher and parent organisations were excluded from the process. Gillard has clearly learned from him.

“Gillard needs to demonstrate that she is as open and transparent as she wants schools to be. She should release the details of her proposal for school progress reports and invite public discussion. We should have real public debate over proposals before implementation. This is what open government means.”

9 October 2008

Contact: Trevor Cobbold 0410 121 640 (m)


Gillard’s School Reporting Model is a Triumph of Ideology over Evidence

Sunday August 31, 2008

The Rudd Government’s “education revolution” is looking more and more like an extension of the Howard Government’s school policies. All the same elements are there – choice and competition, reliance on markets, and now public reporting of school results.

The model for the new school reporting scheme comes direct from New York. Julia Gillard has been enthusing about the New York system ever since her audience with the New York Schools Chancellor, Joel Klein. She says she is “inspired” and “impressed” by Klein’s model.

It is a pity that Gillard did not look more closely. She would have seen major flaws.

The New York system produces unreliable and misleading comparisons of school performance and student progress. It is incoherent. It can be used to produce league tables. It fails to compare like with like and it is statistically flawed.

Diane Ravitch, Professor of Education at New York University, a former US Assistant Secretary of Education and an advocate of school reporting, now says that New York’s school reporting system is “inherently unreliable”, “dubious” and produces “bizarre results”.

Jennifer Jennings from Columbia University describes it as “statistical malpractice”, “a mess”, and based on “highly questionable methods”. The New York Sun columnist, Andrew Wolf, says that it is “an overblown grading system that already seems to be sinking from its own weight”.

New York uses an incredibly complicated scoring system, requiring two 30-page technical guides to explain. It combines a wide range of information on student achievement, student progress, student composition and school features to obtain a school grade of ‘A’ to ‘F’ and an overall performance score out of 100.

The process by which all this information is combined, weighted and assessed involves many highly arbitrary and subjective judgements. According to Andrew Wolf, it involves “a bagful of subjective adjustments, bonus points and bureaucratic discretions”. It is riddled with inconsistencies.

Amongst the most bizarre results of the system is that high performing schools can be assessed as failing and be closed down. For example, the elementary school PS 35 on Staten Island was graded last year as failing even though 85% of its students passed the reading test and 98% passed the mathematics test.

It was failed because its students had shown insufficient improvement from 2006. As a result, it is a candidate to be closed if it fails to improve further.

Julia Gillard says that she doesn’t want “simplistic and silly” league tables, and will only compare schools with a similar student population. This is disingenuous. It is not possible to use the New York model to report student results in like schools without providing the scores for all schools. The New York Times and the New York Post list each school’s grade and overall performance scores. It is a simple matter to rank all schools on their grade and scores in league tables.

Despite what Gillard says about the New York system, it fails to consistently compare like with like. Jennifer Jennings has pointed out that school peer groups include schools with very dissimilar demographic profiles. For example, the percentage of high-achieving Asian students in the schools of one peer group ranges from 1% to 69%. In another, the percentage of low income students ranges from 12% to 94%.

Another major problem is that New York school progress reports do not report measurement errors for school scores and grades. As a result, its comparisons of school performance are likely to be inaccurate and misleading.

Many studies of school performance reporting in England, the US and Australia have shown that a large proportion of school results are statistically indistinguishable when measurement error is taken into account. The problem is magnified for measures of student progress, or ‘value added’ comparisons, where measurement error is inevitably larger.

Astoundingly, Gillard’s preferred model assesses student progress on only one year’s data. Yet, a study published by the US National Bureau of Economic Research shows that 50 to 80 percent of the year-to-year fluctuations in average school test scores are random and have nothing to do with school quality. School comparisons of progress over one year are therefore highly unreliable.

These and many other criticisms mean that the New York reporting system is deeply flawed. Any system based on it will severely mislead the public and parents.

Its adoption will subject school principals and staff to substantial risks of being punished or rewarded on the basis of dubious and unreliable data and for factors beyond their control. It will not accurately identify best practice in schools as Gillard wants.

State and Territory Governments would be well advised to reject the New York model. It will only do harm to a largely successful education system.

Australia and Finland are two of the highest achieving countries in the world in school outcomes according to the PISA surveys conducted by the OECD. Neither country got there by reporting school results.

Why the Rudd Government is choosing to emulate the reporting policies of much lower performing countries like the United States and England can only be explained as a triumph of ideology over evidence.

Trevor Cobbold
Convenor, Save Our Schools
Save Our Schools is a Canberra-based public education advocacy group.


Choice or Equity in Education?

Saturday August 23, 2008

This paper, by Trevor Cobbold, was delivered to the Education Summit in Sydney in June 2008.

It argues that choice has failed the promise of its advocates to improve education outcomes and that it has not only deflected education systems from dealing with the major challenge of inequity in education, but has exacerbated inequity.

The result has been to reinforce privilege in education. The paper further argues that while choice in any education system is inevitable and cannot be denied, it should be strictly controlled in order to give priority to improving equity in education. The paper also sets out some fundamental steps to improve equity in education in Australia.

Choice or Equity in Education.pdf


Media Release 30 May 2008 – Rich Families Benefit Most from Over-Funding of Private Schools

Thursday May 29, 2008

A study of the SES funding model for private schools released today shows that two-thirds of all private school students are over-funded and that schools serving the wealthiest families are vastly more over-funded than those serving low income families.

The study was done by Save Our Schools, a public education advocacy group based in Canberra.

Trevor Cobbold, SOS spokesman and co-author of the study, said that the analysis demonstrates that private school funding is in need of urgent revision.

“Current Australian Government funding of private schools is incoherent and capricious.

“The SES funding model being continued by the Rudd Government delivers more than $2 billion in over-funding over four years to some of the wealthiest parents in Australia, supporting them to send their children to some of the most elite schools in Australia. In contrast, the poorest private schools get no over-funding.

“It provides preferential treatment of schools associated with one religious group, and creates major disparities in funding between states. In some cases, there are as many as nine different funding levels for schools on the same SES score.”

Mr. Cobbold said that the study has revealed several new aspects of the SES funding arrangements.

“The study shows that the extent of over-funding of private schools is much higher than previously thought:

  • 64% of all private school students are over-funded;
  • 70% of all Catholic systemic school students are over-funded;
  • 56% of Independent school students are over-funded.

“The top 20 over-funded primary schools in Australia received average over-funding of between $2534 and $3072 per student per year during 2005-2007. The top 20 over-funded secondary schools received average over-funding of between $2485 and $3306 per student per year.

“Catholic and Independent schools serving the wealthiest families receive the highest amounts of over-funding per student per year:

  • Catholic primary schools in the highest SES score range of 126-134 were over-funded by $2923 per student;
  • Catholic secondary schools in the score range of 116-125 were over-funded by $2738 per student (there were no Catholic systemic secondary schools in the score range of 126-134);
  • Independent FM primary schools serving the highest income families were over-funded by $602 per student;
  • Independent FM secondary schools were over-funded by $822 per student;
  • Catholic and Independent schools serving the poorest families did not receive any over-funding.

“The study also shows that the extent of inequality in funding schools on the same SES score is much more extensive than previously thought. It shows that schools on the same SES score have several different levels of funding per student. For example, there are 9 different levels of funding for schools on the SES score of 116 and 7 different levels of funding for schools on the SES scores of 109, 114 and 118.”

Mr. Cobbold said that the study shows that the SES funding model being continued by the Rudd Government until 2012 is illogical and unfair.

“The SES model is not delivering a systematic, consistent and fair funding allocation system for private schools. It also provides significant levels of government funding to wealthy private schools whose total funding (from private and government sources) is well above the average for government schools.

“The model is in need of urgent revision to better take account of the differing social roles of private and government schools and to better take account of differing levels of student learning needs in schools.”

The estimates used in the SOS study are derived from school funding data provided to the Senate Estimates Committee by the Department of Education, Science and Training in November 2006. See the answer to Question on Notice E527_07, Attachments A and B, available at: http://www.aph.gov.au/Senate/committee/eet_ctte/estimates/sup_0607/dest/index.htm

Contact: Trevor Cobbold 0410 121 640 (m)
