Beware False Idols of Education Excellence

The international PISA tests have become “false idols of educational excellence for the world to worship”. They have extraordinary status and influence. Education systems and even nations are judged by test scores and rankings that are assumed to be accurate measures of achievement. A primary assumption is that all students always try their best. A growing literature shows this to be false.

The OECD’s report on PISA 2018 found that over two-thirds of students across the OECD did not fully try on the tests. Eighty per cent of students in Germany did not fully try, as did 79% in Denmark and Canada and 78% in Switzerland. Some 73% of Australian and New Zealand students did not fully try. In contrast, the figures were 46% in Korea and 60% in Japan. This variation calls into question the validity of league tables of countries based on PISA results.

There is a growing research literature showing that test-taking motivation has a profound effect on student results in low-stakes tests such as PISA. Students have little incentive to perform well in such tests because the results carry no personal consequences: students and their parents are not given the results, and the results have no effect on future academic careers at school. Different groups of students may also have different levels of motivation, which may bias comparisons of achievement.

A new paper published in Educational Research Review examined the results of 28 studies of test-taking effort and test performance conducted between 2005 and 2018. Most of the studies used low-stakes tests. Nearly all found a statistically significant positive relationship between test-taking effort and test results: higher motivation and effort led to higher results, and lower motivation and effort led to lower results.

Most studies measured motivation by student self-report, using questionnaires administered after the tests. A few measured it by response times to questions, with a higher proportion of very rapid responses taken to indicate guessing and lower effort. The review found that response time was a much better predictor of test performance than self-reporting, though it noted that the reasons for this difference are not clear.
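To illustrate how a response-time measure of effort works, here is a minimal sketch, assuming a fixed time threshold below which an answer counts as a rapid guess. The threshold, the function name and the sample data are illustrative assumptions for demonstration, not details drawn from the studies reviewed.

```python
# A minimal sketch of a response-time "effort" measure. The 3-second
# threshold, the function name and the sample data are illustrative
# assumptions, not values taken from the studies reviewed.

RAPID_GUESS_THRESHOLD = 3.0  # seconds; real studies calibrate this per question


def response_time_effort(response_times):
    """Return the share of responses slow enough to count as genuine attempts.

    Responses faster than the threshold are treated as rapid guesses:
    answers given too quickly for the student to have read the question.
    """
    if not response_times:
        raise ValueError("no response times supplied")
    effortful = sum(1 for t in response_times if t >= RAPID_GUESS_THRESHOLD)
    return effortful / len(response_times)


# A student who rapid-guessed on 2 of 10 questions gets an effort index
# of 0.8, i.e. 20% of their responses look like guesses rather than attempts.
times = [12.4, 1.1, 8.9, 15.2, 2.3, 9.7, 22.0, 7.5, 11.8, 14.1]
print(f"Response-time effort index: {response_time_effort(times):.2f}")  # 0.80
```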

These results complement those of a 2005 meta-analysis of studies comparing results in low-stakes tests with those in high-stakes tests, such as the assessments required for completion of Year 12. It found that students’ results in low-stakes tests are lower than their results in high-stakes tests.

These findings have major implications for international comparisons of test results such as PISA. As the OECD acknowledged in its PISA 2018 report, differences in student effort across countries will affect country results and rankings [p. 198]. A study published by the US National Bureau of Economic Research, based on data from PISA 2015, found that differences in the proportion of students not fully trying had a large impact on the rankings of several countries. For example, it estimated that Portugal’s ranking in science in PISA 2015 would have improved by 15 places, from 31st to 16th, if students had fully tried. Sweden’s ranking would have improved by 11 places, from 33rd to 22nd, and Australia’s by four places, from 16th to 12th.

A significant gap in the current research on test motivation is the extent to which it changes over time in different countries and how it affects trends in the results of low-stakes tests such as PISA. Several OECD countries, including Australia, have seen significant declines in PISA scores since 2000, and reduced student motivation may be a factor. There is no direct evidence of this, but there is indirect evidence suggesting it is one factor among several.

The PISA results show increasing student dissatisfaction at school across OECD countries, which may show up in reduced effort and lower results. For example, student dissatisfaction at school among 15-year-olds in Australia has increased significantly: the proportion of students who feel unconnected with school increased fourfold, from 8% to 32%, between PISA 2003 and PISA 2018. This was the third-largest increase in the OECD, behind France and the Slovak Republic.

The fact that one-third of Australian students are dissatisfied with school is likely to manifest in low motivation and effort in low-stakes tests. The OECD says that the relationship between a sense of belonging at school and performance in PISA is strong for students with the least sense of belonging [OECD 2016, p. 122]: students who feel they do not belong at school achieve significantly lower results in PISA than those who feel they do belong.

Australia is not the only country with declining PISA results and increasing student dissatisfaction with school. Average PISA results for OECD countries have fallen since 2000, while the proportion of students who feel they don’t belong at school increased threefold, from 7% to 22%. Of the 30 countries for which data are available, all experienced an increase in student dissatisfaction at school, and PISA maths results fell in 21 of them (see Chart). New Zealand experienced increases in student dissatisfaction and declines in PISA results similar in size to Australia’s. Canada, France and the Czech Republic also experienced large increases in student dissatisfaction and significant falls in PISA maths scores.

[Chart: changes in student dissatisfaction at school and PISA maths results across OECD countries.] Sources: PISA 2015, Vol. 3, Table III.7.4; PISA 2018, Vol. 2, Table II.B1.3.4; PISA 2018 Country Overviews.

In contrast, several countries, including Austria, Denmark, Estonia and Japan, had only small increases in student dissatisfaction and only small (though statistically significant) changes in their PISA scores.

This is not to say that student dissatisfaction with school is the only factor contributing to declining PISA results; there is clearly no one-to-one relationship. For example, some countries, such as Finland, Korea and the Netherlands, had small increases in student dissatisfaction but large declines in PISA scores, while Italy, Poland and Portugal achieved large increases in PISA scores despite modest increases in student dissatisfaction.

Clearly, other factors also influence test scores, such as economic inequality, school funding, immigration, and shortages of teachers and learning materials. For example, the decline in PISA results for Finland is similar to that for Australia, but its increase in student dissatisfaction at school was half that of Australia. A large increase in the immigrant population and cuts in school funding may have contributed to the decline in Finland. Australia’s results are likely to be affected by shortages of qualified teachers: about 20% of maths teachers in Australia, for example, are not qualified to teach maths.

Pasi Sahlberg, Professor of Education at the Gonski Institute for Education, says that digital devices are a growing distraction for students in many countries, including Australia, Canada and Finland. A report published by the Gonski Institute shows that students’ increasingly prolific use of technology has presented a broad range of challenges to their wellbeing and health. Sahlberg says that “young people are not what they used to be” and that they may be less able to engage in cognitive and social tasks than in the past.

The possibility that student effort on PISA has declined helps explain the contradiction between Australia’s PISA and Year 12 results. Some 75–80 per cent of Australian students participating in PISA are in Year 10. It is perplexing that PISA results for these students have declined since 2000 while results for students two years further on, in Year 12, have improved significantly: the percentage of the estimated Year 12 population that completed Year 12 increased from 68% in 2001 to 79% in 2018 [Report on Government Services 2007, Table 3A.122 & 2020, Table 4A.70]. Year 12 assessments are high-stakes in comparison to PISA, even for less motivated students, because they have personal consequences for future education and work opportunities.

Low-stakes tests should not be turned into high-stakes tests to increase student effort. High-stakes tests harm education: they narrow the curriculum as more time is devoted to tested subjects while untested subjects receive much less, they turn classrooms into test-preparation factories, and they reduce collaboration between schools on successful teaching practices while increasing social segregation between schools.

Much more detailed research is needed to establish a causal relationship between low motivation and effort, student dissatisfaction with school, and declining PISA results in Australia and other OECD countries. However, the results of the 28 studies reviewed in the new paper strongly indicate that international and national test results are significantly affected by differences in student motivation and effort. The results may be as much a measure of student effort as of student learning. Caution is necessary in interpreting results and drawing strong policy conclusions from these “false idols of educational excellence”.
