One factor not considered in the commotion over the continuing decline in Australia’s PISA results is whether students try their best on the tests. The OECD’s own report on PISA 2018 shows that about three in four Australian students, and two-thirds of students across OECD countries, did not try their hardest on the tests. There are also wide differences between countries. This has potentially explosive implications for the validity of international comparisons of student achievement based on PISA.
The PISA data also shows increasing student dissatisfaction with school, which likely contributes to a lack of effort on the tests and is one factor, among others, behind Australia’s declining results. There is also a perplexing contradiction between Australia’s declining PISA results and its improving Year 12 results. Lack of effort in PISA may partly explain this, because performance on PISA has no consequences for students; they do not even receive their individual results. In contrast, Year 12 outcomes affect students’ life chances, so even students dissatisfied with school have a greater incentive to try hard. The fact that Australia’s Year 12 results have improved significantly since the early 2000s raises further questions about the reliability of the PISA results.
Source: OECD, PISA 2018 Results (Vol 1): What Students Know and Can Do, Online Table I.A8.1.
The OECD report on PISA 2018 shows that 68% of students across the OECD did not fully try [Chart 1]. In Australia, 73% of students did not make a full effort, the 14th highest proportion out of 36 OECD countries. The report also shows large variation in student effort across countries. Around 80% of students in Germany, Denmark and Canada did not fully try, compared to 60% in Japan and 46% in Korea.
The relatively high proportion of Australian students not fully trying may be contributing to Australia’s lower results relative to other high-achieving OECD countries. Five of the six OECD countries with significantly higher reading results than Australia, and six of the seven with higher science results, had a lower proportion of students not fully trying. These countries included Estonia, Finland, Korea and Poland. However, several countries with significantly higher mathematics results than Australia had a larger proportion of students not fully trying, including Germany, Denmark, Canada and Switzerland.
The report adds to extensive research evidence on student effort in standardised tests. Many overseas studies over the past 20 years have found that students make less effort in tests that have few or no consequences for them. For example, a study published last year by the US National Bureau of Economic Research (NBER), based on data from PISA 2015, found that a relatively high proportion of students in Australia and other countries did not take the test seriously.
There is also anecdotal evidence to indicate that low student effort is a factor in Australia’s PISA results. For example, one 15-year-old who participated in PISA 2015 said:
My peers who took part in this test were unanimous in that they did not, to the best of their ability, attempt the exam.
I have heard many similar anecdotes from students in Canberra.
Less effort in tests leads to lower results. As the OECD report on PISA 2018 states: “differences in countries’ and economies’ mean scores in PISA may reflect differences not only in what students know and can do, but also in their motivation to do their best” [p.198].
There is little direct evidence of declining student effort in PISA as a factor behind the large decline in Australia’s PISA results since 2000. However, there is evidence of increasing student dissatisfaction at school which might show up in reduced effort and lower results.
Student dissatisfaction at school amongst 15-year-olds in Australia has increased significantly since 2003. The proportion of students who feel unconnected with school increased by 24 percentage points, from 8% to 32%, between PISA 2003 and 2018. This was the 3rd largest increase in the OECD behind France and the Slovak Republic [Chart 2]. In PISA 2018, Australia had the equal 4th largest proportion of students who feel unconnected with school in the OECD.
Source: OECD, PISA 2015 Results (Volume III): Students’ Well-Being. Tables III 7.4 & III 7.5; OECD, PISA 2018 Results (Volume II): Where All Students Can Succeed, Table II B 1.3.4.
Note: The change is from 2012 for Chile, Estonia, Slovenia and USA.
The large increase in student dissatisfaction at school in Australia may have led to lower motivation and effort in PISA over time. The OECD says that the relationship between a feeling of belonging at school and performance in PISA is strong for those students with the least sense of belonging [OECD 2016, p. 122]. Students who feel they do not belong at school have significantly lower levels of achievement in PISA than those who do feel they belong.
However, the evidence that declining effort on PISA is a factor behind the declining results is only suggestive. An issue with the OECD data on student effort in PISA is that it is based on student self-reporting. There are well-known problems with self-reporting such as how truthful students are about their effort and the extent to which answers provided on subjective response scales can be compared across students and across countries.
The OECD report also draws on other methods to measure student effort based on student behaviour in computer-based tests, including measures of “response-time effort”, “test endurance” and the proportion of items not reached in the tests. It found significant inconsistencies between country rankings of effort based on response-time effort and those based on the share of students who reported they would have worked harder on the test if it had counted towards their grades. This suggests that students made more effort on PISA than their self-reports indicate.
Nevertheless, there are also problems with these other measures. For example, the response-time effort measure makes the arbitrary assumption that students who spend more than five seconds on a test item are making a genuine effort. This is not necessarily the case: students who spend more time on an item could be killing time or “switching off” rather than trying to answer. For this reason, reliance on the response-time measure of student effort could well underestimate the extent to which students do not fully try.
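The mechanics of a response-time effort measure of this kind can be sketched in a few lines. This is an illustrative reconstruction only, not the OECD’s actual procedure: the function name, the data and the simple fixed cut-off are assumptions, with the five-second threshold taken from the description above.

```python
# Hypothetical sketch of a response-time effort (RTE) style measure.
# Any item answered in more than THRESHOLD_SECONDS is treated as a
# genuine attempt -- the arbitrary assumption criticised above, since a
# long response time may equally reflect a student "switching off".
THRESHOLD_SECONDS = 5.0

def response_time_effort(item_times):
    """Share of test items on which the student spent more than the
    threshold, read (perhaps optimistically) as genuine effort."""
    if not item_times:
        return 0.0
    engaged = sum(1 for t in item_times if t > THRESHOLD_SECONDS)
    return engaged / len(item_times)

# Invented response times in seconds: four rapid guesses, four
# longer attempts, giving an RTE of 0.5.
times = [2.1, 3.0, 40.5, 28.0, 1.9, 55.2, 4.0, 33.3]
print(response_time_effort(times))  # → 0.5
```

A real implementation would use item-specific thresholds rather than a single cut-off, which is one reason the OECD’s behavioural measures and the self-reports can disagree.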
The OECD report is inconclusive about the extent of the effect of student effort on test results and country rankings. However, it does acknowledge that differences in student effort across countries will affect country results and rankings and this is supported by other recent research evidence.
The NBER study noted above also used response time as the measure of student effort and found much lower levels of students not fully trying in PISA 2015 than the self-reporting in 2018. It found that the proportion of non-serious students varies enormously by country from 14% in Korea to 67% in Brazil. It rated 23% of Australian students as non-serious compared to 15-18% in high achieving countries such as Finland (15%), Japan (18%), Korea (14%) and Singapore (17%).
Even these lower proportions of students not fully trying in PISA 2015 had a large impact on the rankings for several countries. For example, the study estimated that Portugal’s ranking in science in PISA 2015 would have improved by 15 places from 31st to 16th if students had fully tried. Sweden’s ranking would have improved 11 places from 33rd to 22nd and Australia’s ranking by four places from 16th to 12th.
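The kind of re-ranking exercise the NBER study performed can be illustrated with a toy calculation. The figures below are entirely invented, not PISA data: each hypothetical country has a mean score for serious students, a lower mean for non-serious students, and a share of non-serious students. The observed mean blends the two; the adjusted mean keeps only serious students, and rankings can shift as a result.

```python
# Invented example data, purely illustrative:
# (mean score of serious students, mean score of non-serious
#  students, share of non-serious students)
countries = {
    "A": (520, 430, 0.15),
    "B": (515, 400, 0.40),  # many students coasting
    "C": (505, 480, 0.05),
}

def observed_mean(serious, non_serious, share):
    """Published-style mean: a weighted blend of both groups."""
    return (1 - share) * serious + share * non_serious

observed = {c: observed_mean(*v) for c, v in countries.items()}
adjusted = {c: v[0] for c, v in countries.items()}  # serious students only

def rank(scores):
    return sorted(scores, key=scores.get, reverse=True)

print(rank(observed))  # → ['A', 'C', 'B']
print(rank(adjusted))  # → ['A', 'B', 'C']
```

In this toy example, country B sits last on observed scores but climbs to second once non-serious students are excluded, mirroring the kind of movement the study estimated for Portugal and Sweden.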
If the proportions not fully trying are higher than indicated by the NBER report and closer to those self-reported to PISA, the implications for PISA scores in many countries, including Australia, are massive. Actual scores are likely to be significantly under-estimated, the declines in scores over-estimated and country rankings on PISA seriously distorted. This calls into question the validity of PISA as an accurate representation of student achievement within and across countries.
The possibility that student effort on PISA has declined helps explain the contradiction between Australia’s PISA and Year 12 results. Some 75-80 per cent of Australian students participating in PISA are in Year 10. It is perplexing that the PISA results for these students have declined since 2000 while results for students two years later in Year 12 have improved significantly.
The percentage of the estimated Year 12 population that completed Year 12 increased from 68% in 2001 to 79% in 2017 [Report on Government Services 2007, Table 3A.122 & 2019, Table 4A.58]. In 2018, 90% of 20 to 24-year-olds had attained Year 12 or Certificate II, up from 79% in 2001 [ABS, Education and Work, 2011, Table 1.1 & 2019, Table 30]. In addition, the Report on Government Services shows that a larger proportion of disadvantaged students now complete Year 12. For example, the percentage of low socio-economic status students who completed Year 12 increased from 64% in 2001 to 76% in 2017.
OECD data also shows that Australia had one of the larger increases in the OECD in the proportion of 25-34 year-olds who attained an upper secondary education. It increased by 18 percentage points, from 71% in 2001 to 89% in 2018, compared to the OECD average increase of 11 percentage points [Education at a Glance 2002, Table A1.2 & 2019, Table A1.2].
These are indicators of an improving education system, not a deteriorating one. Part of the explanation for the differing results in PISA and Year 12 is that PISA is a low-stakes assessment with no consequences for students or their schools, while Year 12 is a high-stakes assessment even for disconnected students because it has a major influence on their future careers and lives.
Thus, there is credible evidence that a significant, possibly large, proportion of Australian students did not fully try on PISA 2018. This is likely to be a factor contributing to Australia’s relatively poor performance compared to other high-achieving countries. It is also possible that increasing student dissatisfaction since 2003 has led to an increasing proportion of students not fully trying, thereby contributing to the decline in Australia’s results. This suggests that the doom and gloom about Australia’s latest PISA results is misplaced, and that any policy responses should be based on a more comprehensive analysis of the factors behind these results.