The international PISA tests have become “false idols of educational excellence for the world to worship”. They have extraordinary status and influence. Education systems and even nations are judged by test scores and rankings that are assumed to be accurate measures of achievement. A primary assumption is that all students always try their best. A growing literature shows this to be false.
The OECD report on PISA 2018 found that over two-thirds of students across the OECD did not fully try on the tests. In Germany the figure was 80%, in Denmark and Canada 79%, and in Switzerland 78%. Some 73% of Australian and New Zealand students did not fully try. The figures were lower, though still substantial, in Korea (46%) and Japan (60%). This variation calls into question the validity of league tables of countries based on PISA results.
Continue reading “Beware False Idols of Education Excellence”
One factor not considered in the commotion over the continuing decline in Australia’s PISA results is whether students try their best on the tests. The OECD’s own report on PISA 2018 shows that about three in four Australian students, and two-thirds of students across OECD countries, did not try their hardest on the tests. There are also wide differences between countries. This has potentially explosive implications for the validity of international comparisons of student achievement based on PISA.
The PISA data also shows increasing student dissatisfaction with school, which likely contributes to lack of effort on the tests and is one factor, among others, behind Australia’s declining results. There is also a perplexing contradiction between Australia’s declining PISA results and its improving Year 12 results. Lack of effort on PISA may partly explain this: performance on PISA has no consequences for students, who do not even receive their individual results. Year 12 outcomes, by contrast, affect students’ life chances, so even students dissatisfied with school have a greater incentive to try harder. The fact that Australia’s Year 12 results have improved significantly since the early 2000s raises further questions about the reliability of the PISA results.
Continue reading “OECD Says 3 in 4 Australian Students Do Not Try on PISA Tests”
A US high school student takes issue with standardised tests.
Continue reading “A Poem on the Ravages of Standardised Tests”
How often have we all sat through those frustrating meetings where someone from head office or a university articulates, with such commitment, the first lie: if you can’t measure it, then it’s not worth doing.
This quantification of education, based on an economically rational approach, started in the sixties, the dawn of outcomes-based learning.
Continue reading “The Dishonourable Lie”
This is a summary of a new Education Research Brief, which can be downloaded below.
A much-ignored aspect of school results in Australia over the past decade or more is the sharp contrast between declining or stagnating scores on international and national tests for Years 9 and 10 and solid improvements in Year 12 results. How is it that trends in school outcomes only two or three Year levels apart are so different? Continue reading “Have Kids Stopped Trying on PISA and NAPLAN?”
National literacy and numeracy tests will now have ‘high stakes’ attached to them as a result of the decision of Australian education ministers, at the initiative of the Rudd Government, to publish the results of individual schools.
It means that league tables are now inevitable in Australia. This will put schools under enormous pressure to maintain reputations and enrolments. The future of some schools will also be threatened because the Prime Minister has stated that sanctions will be applied to schools that don’t improve their performance. Continue reading “League Tables Create Incentives for Schools to Rig Their Results”
Last year, the Director of Education at the OECD, Andreas Schleicher, admitted that the switch from pen-and-paper to computer tests for the PISA 2015 assessments may have contributed to significant falls in results among higher-performing countries. A new research paper published by the Centre for Education Economics in the UK provides more evidence for this. Continue reading “Caution Needed in Interpreting PISA 2015 Results”
The introduction of a National Year 1 Literacy and Numeracy Check has been heavily criticised by the Australian Literacy Educators’ Association (ALEA) in a position statement. It says that there is an unreasonable over-emphasis on phonics in the new assessment tool. Continue reading “Does Australia need an assessment tool to measure literacy and numeracy achievement in Year 1 classrooms?”
One of the standout performers in the results from PISA 2015 was Vietnam. It achieved a ranking of 8th in science with a score of 525, which was significantly above Australia’s score of 510. More remarkably, only 6% of its students were below the minimum PISA standard compared to 18% of students in Australia. Vietnam had the smallest proportion of students below the science standard of the 72 countries and economies participating in PISA 2015.
However, there seems to be more to these results than meets the eye: over half of Vietnam’s 15-year-old population was not covered by the PISA sample because they were not in school. Continue reading “PISA Rankings Are Misleading Because of Differences in Student Coverage”
Earlier this week the ABC’s Life Matters program ran a segment on parents taking their children out of the NAPLAN tests. It generated considerable discussion. A listener who is a teacher sent SOS this response to the program.
School leaders and systems use and recognise NAPLAN because its official status has made it the universal measure across the country, not because it is valuable per se. It would be a concern if teachers relied on NAPLAN as their only diagnostic tool, given the time lag between the test and the results. Good teachers are constantly assessing students in a number of ways; they do not rely on a single standardised test but involve students in continual modes of improvement. Testing is not part of life, it is part of school life. How often does anyone who goes to work have to sit an exam to prove that they are learning, doing their job, and demonstrating all they know in an hour or two? As adults we would say that was unreasonable. Performance evaluation at work relies on continuous practice. Continue reading “A Teacher’s Comment on NAPLAN”